Search results for: rapid compression machine
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6058

5338 Dynamic Web-Based 2D Medical Image Visualization and Processing Software

Authors: Abdelhalim. N. Mohammed, Mohammed. Y. Esmail

Abstract:

Over recent decades, medical imaging was dominated by costly film media for the review and archival of medical investigations. Developments in network technologies and the wide acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard have enabled an alternative approach based on the World Wide Web. Web technologies have been used successfully in telemedicine applications, and in this work they are combined with DICOM to design a web-based, open-source DICOM viewer. The web server allows query and retrieval of images, and the images are viewed and manipulated inside a web browser without any software pre-installation. The dynamic page for medical image visualization and processing was created using JavaScript and HTML5. The XAMPP Apache server was used to create a local web server for testing and deployment of the dynamic site. The web-based viewer was connected to multiple devices through a local area network (LAN) to distribute images inside healthcare facilities. The system offers several advantages over conventional picture archiving and communication systems (PACS): it is easy to install and maintain, it is platform-independent, it allows images to be displayed and manipulated efficiently, and it is user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique is applied, in which the 2-D discrete wavelet transform decomposes the image and the wavelet coefficients are thresholded and entropy-encoded before transmission to reduce transmission time and storage cost. Compression performance was estimated using image quality metrics such as the mean square error (MSE), peak signal-to-noise ratio (PSNR), and compression ratio (CR), which reached 83.86% when the 'coif3' wavelet filter was used.
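
The compression pipeline described above can be illustrated with a short sketch using the PyWavelets library. This is a minimal, hypothetical example assuming an 8-bit grayscale image; it takes the compression ratio as the fraction of wavelet coefficients discarded by hard thresholding, and it does not reproduce the paper's exact CR definition or entropy coder.

```python
import numpy as np
import pywt

def compress(img, wavelet="coif3", level=3, keep=0.05):
    """Decompose with a 2-D DWT, hard-threshold the coefficients, reconstruct."""
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1 - keep)        # keep the largest 5% of coefficients
    arr_t = pywt.threshold(arr, thresh, mode="hard")
    cr = 100 * (1 - np.count_nonzero(arr_t) / arr_t.size)  # % of coefficients discarded
    rec = pywt.waverec2(pywt.array_to_coeffs(arr_t, slices,
                                             output_format="wavedec2"), wavelet)
    rec = rec[:img.shape[0], :img.shape[1]]            # trim possible padding
    mse = np.mean((img - rec) ** 2)
    psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else np.inf
    return rec, mse, psnr, cr

# Placeholder image standing in for a DICOM slice.
img = np.random.default_rng(0).integers(0, 256, size=(256, 256)).astype(float)
_, mse, psnr, cr = compress(img)
print(f"MSE={mse:.2f}, PSNR={psnr:.2f} dB, CR={cr:.2f}%")
```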

Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN

Procedia PDF Downloads 157
5337 Alternator Fault Detection Using Wigner-Ville Distribution

Authors: Amin Ranjbar, Amir Arsalan Jalili Zolfaghari, Amir Abolfazl Suratgar, Mehrdad Khajavi

Abstract:

This paper describes a two-stage, learning-based fault detection procedure for alternators. The procedure distinguishes three machine conditions: a shorted brush, a high-impedance relay, and a healthy alternator. The fault detection algorithm uses the Wigner-Ville distribution as a feature extractor together with an appropriate feature classifier. In this work, an artificial neural network (ANN) and a support vector machine (SVM) were compared to determine the more suitable classifier, evaluated by the mean squared error criterion. The modules work together to detect possible faulty conditions of operating machines. To test the method's performance, a signal database was prepared by imposing the different conditions on a laboratory setup. The results obtained by implementing this method are satisfactory.
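
As a rough illustration of this pipeline, the sketch below computes a discrete Wigner-Ville distribution with NumPy/SciPy and feeds its frequency marginal to an SVM. It is a simplified stand-in for the paper's setup: the signals, labels, and feature choice are hypothetical, and the ANN branch is omitted.

```python
import numpy as np
from scipy.signal import hilbert
from sklearn.svm import SVC

def wigner_ville(sig):
    """Discrete Wigner-Ville distribution of a real 1-D signal."""
    x = hilbert(sig)                      # analytic signal suppresses negative-frequency terms
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        m_max = min(n, N - 1 - n)         # largest symmetric lag available at time n
        m = np.arange(-m_max, m_max + 1)
        kernel = np.zeros(N, dtype=complex)
        kernel[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.fft.fft(kernel).real    # FFT over the lag axis gives frequency
    return W

def features(sig):
    """Frequency marginal of the WVD as a fixed-length feature vector."""
    return np.abs(wigner_ville(sig)).mean(axis=0)

# Hypothetical alternator signals with condition labels:
# 0 = healthy, 1 = shorted brush, 2 = high-impedance relay.
rng = np.random.default_rng(0)
signals = rng.standard_normal((30, 128))
labels = rng.integers(0, 3, size=30)

X = np.array([features(s) for s in signals])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:5]))
```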

Keywords: alternator, artificial neural network, support vector machine, time-frequency analysis, Wigner-Ville distribution

Procedia PDF Downloads 367
5336 Rapid-Access Multispecialty Nurse-Led Tongue Tie Service: A Retrospective Evaluation of Cost-Effectiveness

Authors: Jia Yin Tan, Daniel Rambei, Kate Mann, Samuel Price, Ahmed Aboelela

Abstract:

Introduction: Breastfeeding is a complex process influenced by various factors. Tongue-tie may lead to breastfeeding difficulties due to an inability to suck effectively, causing sore nipples and poor infant weight gain. In the UK, most frenotomies on infants are performed by doctors, nurses, health visitors or midwives. Objectives: To evaluate the safety and efficacy of a multispecialty nurse-led rapid-access tongue-tie service at Sheffield Children’s Hospital, run jointly by the ENT and paediatric surgery departments. Methodology: A retrospective observational study including all patients attending the ENT and paediatric surgery nurse-led tongue-tie clinics between 1/10/2021 and 30/09/2022. Results: During the study period there were 1135 referrals for frenotomy, with a mean of 15 days between referral and clinic episode. 86.8% of referred patients underwent frenotomy, with a complication rate of 0.1% and a revision rate of 5.4%. Conclusions: Our findings suggest that our rapid-access nurse-led outpatient tongue-tie service is safe and efficacious, with low complication and revision rates. This suggests potential for developing a community-based service, allowing safe and effective care closer to home.

Keywords: tongue tie, frenotomy, cost, nurse-led

Procedia PDF Downloads 106
5335 Determination of Water Pollution and Water Quality with Decision Trees

Authors: Çiğdem Bakır, Mecit Yüzkat

Abstract:

With the increasing emphasis on water quality worldwide, the search for new and intelligent monitoring systems has expanded. The current method is a laboratory process in which samples are taken from bodies of water and tests are carried out in laboratories; this is time-consuming, labour-intensive, and uneconomical. To address this problem, we used machine learning methods to detect water pollution. We built decision trees with the Orange3 software and used them to identify the factors that cause water pollution. An automatic water-quality prediction model was developed from model inputs such as water temperature, pH, transparency, conductivity, dissolved oxygen, and ammonia nitrogen. The proposed approach consists of three stages: preprocessing of the data, feature detection, and classification. The success of the study was assessed with different accuracy metrics, and the results are presented comparatively. The decision tree achieved approximately 98% accuracy.
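
The classification stage can be sketched in a few lines of scikit-learn. The feature names follow the abstract, but the samples, class labels, and tree settings here are hypothetical placeholders (the study itself used Orange3).

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Hypothetical samples: one row per water sample, columns as in the abstract.
feature_names = ["temperature", "pH", "transparency", "conductivity",
                 "dissolved_oxygen", "ammonia_nitrogen"]
rng = np.random.default_rng(42)
X = rng.uniform(size=(200, len(feature_names)))
y = rng.integers(0, 2, size=200)        # 0 = clean, 1 = polluted (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, tree.predict(X_te)))
```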

Keywords: decision tree, water quality, water pollution, machine learning

Procedia PDF Downloads 77
5334 Mechanical Properties, Vibrational Response and Flow-Field Analysis of Staghorn Coral Skeleton, Acropora cervicornis

Authors: Alejandro Carrasco-Pena, Mahmoud Omer, Nina Orlovskaya

Abstract:

The results of studies of the microstructure, mechanical behavior, vibrational response, and flow field of critically endangered staghorn coral (Acropora cervicornis) skeletons are reported. The CaCO₃ aragonite structure of a chemically cleaned coral skeleton of A. cervicornis was studied by optical microscopy and computed tomography. The mechanical behavior was studied using uniaxial compression and the Vickers hardness technique. The average maximum stress measured during uniaxial compression of the skeleton was 10.7 ± 2.24 MPa, and the Vickers hardness was 3.56 ± 0.31 GPa. The vibrational response of the aragonite structure was studied by micro-Raman spectroscopy, which showed a substantial dependence of the structure on applied compressive stress. The flow field around a single coral skeleton, with vortices forming in the wake of the moving skeleton, was measured using particle image velocimetry (PIV). The results are important for further analysis of time-dependent mechanical fatigue behavior and for predicting the lifetime of staghorn corals.

Keywords: failure, mechanical properties, microstructure, Raman spectroscopy

Procedia PDF Downloads 152
5333 Spinal Hydatidosis: Therapeutic Management of 5 Cases

Authors: Ghoul Rachid Brahim, Trad Khodja Rafik

Abstract:

Vertebral localization of the hydatid cyst is a severe form of bone hydatidosis, a parasitic infection caused by the larval form of the tapeworm Echinococcus granulosus. The disease remains silent for a long incubation period, which may explain why this pathology is often discovered at the stage of neurological complications. The objective of this study is to recall the clinical and radiological aspects of this condition and the importance of early diagnosis and appropriate management. We report a study of 5 patients with vertebral hydatidosis, four men and one woman; four patients were operated on in the emergency setting for spinal cord compression (decompression by wide laminectomy with evacuation of intra- and extra-canal vesicles). Albendazole-based medical treatment was instituted in all patients. Results: The evolution was favorable for three patients; the other two patients were reoperated on for a local recurrence. Conclusion: Vertebral hydatidosis is a rare condition with a poor prognosis due to the risk of neurological damage, the infiltrating nature of the bone lesions, the frequency of relapses, and therapeutic difficulties. The only curative method remains surgery, which must aim for complete and wide excision of the lesions, as if for a “malignant tumour”.

Keywords: hydatidosis, Echinococcus granulosus, hydatid cyst, spinal cord compression, laminectomy

Procedia PDF Downloads 94
5332 Digital Architectural Practice as a Challenge for Digital Architectural Technology Elements in the Era of Digital Design

Authors: Ling Liyun

Abstract:

In the field of contemporary architecture, complex forms of architectural works continue to emerge around the world, along with new terminology: digital architecture, parametric design, algorithmic generation, building information modeling, CNC construction, and so on. Architects have gradually mastered new skills of mathematical logic in formal exploration, virtual simulation, and design coordination across the entire construction process. Digital construction technology gives a greater degree of control over construction and ensures its accuracy, creating a series of new construction techniques. As a result, the use of digital technology both improves and expands the practice of the digital architectural design revolution. We worked by reading and analyzing material on the development of digital architecture, a large number of cases, and the architectural design and construction process as a whole. Current developments are introduced and discussed in our paper, including architectural discourse, design theory, digital design models and techniques, material selection, and artificial-intelligence space design. Our paper also examines three representative cases of digital design and construction experiments at length, to expound high informatization, high-reliability intelligence, and high technique in constructing a humane space that copes with the rapid development of urbanization. We conclude that the opportunities and challenges of the shift lie in architectural paradigms: the cooperation methods, theories, models, technologies, and techniques currently employed in digital design research and digital praxis. We also find that the innovative use of space can gradually change the way people learn, talk, and control information. Over the past two decades, digital technology has radically broken the technological constraints of industrial products and dissolved the insistence on any particular architectural style (the doctrine of an era). People should not have to adapt to the machine; rather, the machine should work for its users.

Keywords: artificial intelligence, collaboration, digital architecture, digital design theory, material selection, space construction

Procedia PDF Downloads 132
5331 Advancing Urban Sustainability through Data-Driven Machine Learning Solutions

Authors: Nasim Eslamirad, Mahdi Rasoulinezhad, Francesco De Luca, Sadok Ben Yahia, Kimmo Sakari Lylykangas, Francesco Pilla

Abstract:

With ongoing urbanization, cities face increasing environmental challenges affecting human well-being. To tackle these issues, data-driven approaches to urban analysis have gained prominence, leveraging urban data to promote sustainability. Integrating machine learning techniques enables researchers to analyze and predict complex environmental phenomena such as Urban Heat Island (UHI) occurrences in urban areas. This paper demonstrates the implementation of a data-driven approach with interpretable machine learning algorithms to conduct comprehensive data analyses for sustainable urban design. The developed framework and algorithms are demonstrated for Tallinn, Estonia, to develop sustainable urban strategies that mitigate urban heat waves. Geospatial data, preprocessed and labeled with UHI levels, are used to train various ML models, with logistic regression emerging as the best-performing model based on the evaluation metrics. From it, a mathematical equation is derived that separates areas with and without UHI effects, providing insights into UHI occurrences based on buildings and urban features. The derived formula highlights the importance of building volume, height, area, and shape length in creating an urban environment with UHI impact. The data-driven approach and derived equation inform mitigation strategies and sustainable urban development in Tallinn and offer valuable guidance for other locations with varying climates.
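
The step from a fitted logistic regression to an explicit UHI equation can be sketched as follows. The feature names mirror the abstract, but the data, coefficients, and decision threshold are hypothetical stand-ins for the Tallinn dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

features = ["building_volume", "building_height", "building_area", "shape_length"]
rng = np.random.default_rng(1)
X = rng.uniform(size=(500, len(features)))        # placeholder geospatial features
y = (X @ np.array([1.5, 1.0, 0.8, -0.5])
     + rng.normal(0, 0.3, 500) > 1.4).astype(int)  # placeholder UHI labels

model = LogisticRegression().fit(X, y)

# The fitted model is itself the "mathematical equation":
#   log-odds(UHI) = b0 + b1*volume + b2*height + b3*area + b4*shape_length
terms = " + ".join(f"{c:.3f}*{n}" for c, n in zip(model.coef_[0], features))
print(f"log-odds(UHI) = {model.intercept_[0]:.3f} + {terms}")
# An area is classified as UHI-affected when the log-odds exceed 0 (probability > 0.5).
```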

Keywords: data-driven approach, machine learning transparent models, interpretable machine learning models, urban heat island effect

Procedia PDF Downloads 31
5330 Automatic Lead Qualification with Opinion Mining in Customer Relationship Management Projects

Authors: Victor Radich, Tania Basso, Regina Moraes

Abstract:

Lead qualification is one of the main procedures in Customer Relationship Management (CRM) projects. Its main goal is to identify potential consumers who have the ideal characteristics to establish a profitable and long-term relationship with a certain organization. Social networks can be an important source of data for identifying and qualifying leads since interest in specific products or services can be identified from the users’ expressed feelings of (dis)satisfaction. In this context, this work proposes the use of machine learning techniques and sentiment analysis as an extra step in the lead qualification process in order to improve it. In addition to machine learning models, sentiment analysis or opinion mining can be used to understand the evaluation that the user makes of a particular service, product, or brand. The results obtained so far have shown that it is possible to extract data from social networks and combine the techniques for a more complete classification.
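
A minimal sketch of the proposed extra step, assuming scikit-learn: social posts are scored by a sentiment classifier and the scores are folded into a lead score. The training posts, labels, and weighting are invented for illustration; the paper's actual models and CRM data are not reproduced here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled posts (1 = positive, 0 = negative).
train_posts = ["love this product", "terrible support, very slow",
               "great value, will buy again", "disappointed with the service"]
train_labels = [1, 0, 1, 0]

sentiment = make_pipeline(TfidfVectorizer(), LogisticRegression())
sentiment.fit(train_posts, train_labels)

def lead_score(posts, base_score=0.5, weight=0.5):
    """Blend a CRM base score with the mean positive-sentiment probability."""
    p_pos = sentiment.predict_proba(posts)[:, 1].mean()
    return (1 - weight) * base_score + weight * p_pos

print(lead_score(["the new plan looks great", "support was helpful"]))
```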

Keywords: lead qualification, sentiment analysis, opinion mining, machine learning, CRM, lead scoring

Procedia PDF Downloads 79
5329 The Effect of Surface Modifiers on the Mechanical and Morphological Properties of Waste Silicon Carbide Filled High-Density Polyethylene

Authors: R. Dangtungee, A. Rattanapan, S. Siengchin

Abstract:

Waste silicon carbide (waste SiC) filled high-density polyethylene (HDPE) composites with and without surface modifiers were studied. Two types of surface modifiers, namely high-density polyethylene-grafted-maleic anhydride (HDPE-g-MA) and 3-aminopropyltriethoxysilane, were used in this study. The composites were produced using a two-roll mill and an extruder and shaped in a hydraulic compression molding machine. The mechanical properties of the polymer composites, such as flexural strength and modulus, impact strength, tensile strength, stiffness, and hardness, were investigated over a range of compositions. It was found that flexural strength and modulus, tensile modulus, and hardness increased, whereas impact strength and tensile strength decreased, with increasing filler content compared to the neat HDPE. At similar filler content, both surface modifiers increased the flexural modulus, impact strength, tensile strength, and stiffness but reduced the flexural strength. Morphological investigation using SEM revealed that the improvement in mechanical properties was due to enhanced interfacial adhesion between waste SiC and HDPE.

Keywords: high-density polyethylene, HDPE-g-MA, mechanical properties, morphological properties, silicon carbide, waste silicon carbide

Procedia PDF Downloads 359
5328 Tracing Back the Bot Master

Authors: Sneha Leslie

Abstract:

The current situation in the cyber world is that crimes performed by botnets are increasing while the masterminds (botmasters) are not easily detectable. The botmaster compromises legitimate host machines in the network and makes them bots, or zombies, to initiate cyber-attacks. This paper focuses on live detection of the botmaster in the network, using the Metasploit framework, when a distributed denial of service (DDoS) attack is performed by the botnet. The affected victim machine continuously monitors its incoming packets. Once the victim machine detects an excessive count of packets from any IP address, that IP is noted and details of the noted systems are gathered. Using the vulnerabilities present in the zombie machines (already compromised by the botmaster), the victim machine compromises them in turn. By gaining access to the compromised systems, applications are run remotely, and by analyzing the incoming packets of the zombies, the victim learns the address of the botmaster. This is an effective and simple system in which no specific features of the communication protocol are considered.
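
The first step described above, flagging source IPs that send an excessive number of packets, can be sketched with the Scapy library. The threshold and capture window here are hypothetical, and the later steps (gathering system details and compromising the zombies) are deliberately not shown.

```python
from collections import Counter
from scapy.all import IP, sniff   # packet capture typically requires root privileges

THRESHOLD = 1000                  # hypothetical packets-per-window limit
counts = Counter()
flagged = set()

def on_packet(pkt):
    """Count packets per source IP and flag sources that exceed the threshold."""
    if IP in pkt:
        src = pkt[IP].src
        counts[src] += 1
        if counts[src] > THRESHOLD and src not in flagged:
            flagged.add(src)
            print(f"suspicious source (possible zombie): {src}")

# Monitor incoming traffic for 60 seconds; flagged IPs become candidates
# for the follow-up analysis described in the abstract.
sniff(filter="ip", prn=on_packet, store=False, timeout=60)
```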

Keywords: botnet, DDoS attack, network security, detection system, Metasploit framework

Procedia PDF Downloads 250
5327 New Machine Learning Optimization Approach Based on Input Variables Disposition Applied for Time Series Prediction

Authors: Hervice Roméo Fogno Fotsoa, Germaine Djuidje Kenmoe, Claude Vidal Aloyem Kazé

Abstract:

One of the main applications of machine learning is the prediction of time series. A more accurate prediction, however, requires a more optimal machine learning model. Several optimization techniques have been developed, but without considering the disposition (ordering) of the system's input variables. This work therefore presents a new machine learning architecture optimization technique based on the optimal disposition of the input variables. Validation is performed on the prediction of wind time series, using data collected in Cameroon. The number of possible dispositions with four input variables is twenty-four, and each disposition is used to perform the prediction, with training and prediction performance as the main criteria. The results obtained from a static and a dynamic neural network architecture show that these performances depend on the input variable disposition, and in a different way for each architecture. This analysis reveals that the input variable disposition must be taken into account to develop a more optimal neural network model. A new neural network training algorithm is therefore proposed, introducing the search for the optimal input variable disposition into the traditional back-propagation algorithm. The results of applying this new optimization approach to the two single neural network architectures are compared step by step with the previously obtained results. The proposed approach is also validated in a collaborative optimization method with a single-objective optimization technique, i.e., genetic-algorithm back-propagation neural networks. From these comparisons, it is concluded that each proposed model outperforms its traditional counterpart in training and prediction performance, so the proposed optimization approach can be useful in improving the accuracy of machine-learning-based time series prediction.
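
The search over input dispositions can be sketched as an exhaustive loop over the 24 permutations of four inputs, training one model per ordering and keeping the best. The wind data, lag features, and network settings below are hypothetical, and the paper's modified back-propagation algorithm itself is not reproduced.

```python
from itertools import permutations
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Hypothetical wind series turned into 4 lagged input variables and a target.
rng = np.random.default_rng(0)
series = np.sin(np.arange(600) / 10) + 0.1 * rng.standard_normal(600)
X = np.column_stack([series[i:i - 5] for i in range(4)])   # lags t-5 .. t-2
y = series[5:]
X_tr, X_va, y_tr, y_va = X[:400], X[400:], y[:400], y[400:]

best = (np.inf, None)
for order in permutations(range(4)):                       # all 24 dispositions
    perm = list(order)
    net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    net.fit(X_tr[:, perm], y_tr)
    mse = mean_squared_error(y_va, net.predict(X_va[:, perm]))
    if mse < best[0]:
        best = (mse, perm)
print("best disposition:", best[1], "validation MSE:", best[0])
```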

Keywords: input variable disposition, machine learning, optimization, performance, time series prediction

Procedia PDF Downloads 103
5326 Harmonic Data Preparation for Clustering and Classification

Authors: Ali Asheibi

Abstract:

The rapid increase in the size of the databases required to store power quality monitoring data has demanded new techniques for analysing and understanding the data. One suggested technique to assist in the analysis is data mining. Preparing raw data to be ready for data mining exploration takes up most of the effort and time spent in the whole data mining process. Clustering is an important technique in data mining and machine learning in which underlying, meaningful groups of data are discovered. Large amounts of harmonic data have been collected from an actual harmonic monitoring system in a distribution system in Australia over three years. This amount of acquired data makes it difficult to identify operational events that significantly impact the harmonics generated on the system. In this paper, harmonic data preparation processes for a better understanding of the data are presented. Underlying classes in the data have then been identified using a clustering technique based on the Minimum Message Length (MML) method. The underlying operational information contained within the clusters can be rapidly visualised by engineers. The C5.0 algorithm was used for classification and interpretation of the generated clusters.

Keywords: data mining, harmonic data, clustering, classification

Procedia PDF Downloads 241
5325 An Application to Predict the Best Study Path for Information Technology Students in Learning Institutes

Authors: L. S. Chathurika

Abstract:

Early prediction of student performance is an important factor in achieving academic excellence. Whatever the study stream in secondary education, students lay the foundation for higher studies during the first year of their degree or diploma program in Sri Lanka. The information technology (IT) field has improved on this in the education domain by letting students select specialization areas that showcase their talents and skills, such as software engineering, network administration, database administration, and multimedia design. After completing the first year, students attempt to select the best path by considering numerous factors. The purpose of this experiment is to predict the best study path using machine learning algorithms. Five classification algorithms were selected and tested: decision tree, support vector machine, artificial neural network, Naïve Bayes, and logistic regression. The support vector machine obtained the highest accuracy, 82.4%. The most influential features were then identified to select the best study path.
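
The model comparison can be sketched with scikit-learn as below. The student records are placeholders; the real study's features (e.g., first-year marks) and preprocessing are assumptions here.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

# Hypothetical data: first-year module marks -> best specialization (0..3).
rng = np.random.default_rng(7)
X = rng.uniform(0, 100, size=(300, 6))
y = rng.integers(0, 4, size=300)

models = {
    "decision tree": DecisionTreeClassifier(),
    "SVM": SVC(),
    "neural network": MLPClassifier(max_iter=2000),
    "Naive Bayes": GaussianNB(),
    "logistic regression": LogisticRegression(max_iter=2000),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```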

Keywords: algorithm, classification, evaluation, features, testing, training

Procedia PDF Downloads 118
5324 Sandwich Structure Composites: Effect of Kenaf on Mechanical Properties

Authors: Maizatulnisa Othman, Mohamad Bukhari, Zahurin Halim, Souad A. Muhammad, Khalisani Khalid

Abstract:

Sandwich structure composites produced with an epoxy core and aluminium skins were developed as potential building materials. The interface bonding between core and skin was controlled by varying the kenaf content. Five different weight percentages of kenaf loading, ranging from 10 wt% to 50 wt%, were employed in the core manufacturing in order to study the mechanical properties of the sandwich composite. The properties of the aluminium skin with epoxy were found to be affected by the drying time of the adhesive. The mechanical behavior of the manufactured sandwich composites was studied in relation to the properties of the constituent materials. It was found that 30 wt% kenaf loading increased the flexural strength and flexural modulus up to 102 MPa and 32 GPa, respectively. Analyses were performed on flatwise and edgewise compression tests. For the flatwise test, 30 wt% fiber loading could withstand a maximum force of 250 kN, with a compressive strength of 96.94 MPa. In the edgewise compression test, however, the sandwich composite with the same fiber loading could only withstand a maximum load of 31 kN, with a compressive strength of 62 MPa.

Keywords: sandwich structure composite, epoxy, aluminium, kenaf fiber

Procedia PDF Downloads 389
5323 Robust Fuzzy PID Stabilizer: Modified Shuffled Frog Leaping Algorithm

Authors: Oveis Abedinia, Noradin Ghadimi, Nasser Mikaeilvand, Roza Poursoleiman, Asghar Poorfaraj

Abstract:

In this paper, a robust fuzzy proportional-integral-derivative (PID) controller is applied to a multi-machine power system based on the Modified Shuffled Frog Leaping (MSFL) algorithm. This newly proposed controller is more efficient because it copes with oscillations and different operating points. In this strategy, the gains of the PID controller are optimized using the proposed technique. The nonlinear problem is formulated as an optimization problem over wide ranges of operating conditions using the MSFL algorithm. The simulation results demonstrate the effectiveness, good robustness, and validity of the proposed method through performance indices such as ITAE and FD under wide ranges of operating conditions, in comparison with TS and GSA techniques. The single-machine infinite-bus system and the New England 10-unit 39-bus standard power system are employed to illustrate the performance of the proposed method.
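
The ITAE index that drives the gain optimization can be illustrated with a small sketch, assuming the python-control package. A toy second-order plant and plain random search stand in for the multi-machine model and the MSFL algorithm, which are not reproduced here.

```python
import numpy as np
import control   # python-control package

G_num, G_den = [1.0], [1.0, 1.0, 0.0]          # toy plant G(s) = 1 / (s(s+1))

def itae(gains):
    """ITAE = integral of t*|e(t)| over the closed-loop unit step response."""
    kp, ki, kd = gains
    # Open loop C(s)G(s) with C(s) = (kd s^2 + kp s + ki)/s, built in one call
    # so every transfer function stays proper.
    L = control.tf([kd, kp, ki], np.polymul([1.0, 0.0], G_den))
    T = control.feedback(L, 1)
    t, y = control.step_response(T, T=np.linspace(0, 20, 2001))
    return np.trapz(t * np.abs(1.0 - y), t)

# Random search stand-in for MSFL: sample gain triples, keep the lowest ITAE.
rng = np.random.default_rng(3)
best = min((rng.uniform(0.1, 10.0, 3) for _ in range(200)), key=itae)
print("gains (kp, ki, kd):", best, "ITAE:", itae(best))
```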

Keywords: fuzzy PID, MSFL, multi-machine, low frequency oscillation

Procedia PDF Downloads 425
5322 A Method for Rapid Evaluation of Ore Breakage Parameters from Core Images

Authors: A. Nguyen, K. Nguyen, J. Jackson, E. Manlapig

Abstract:

With recent advancements in core imaging systems, a large volume of high-resolution drill core images can now be collected rapidly. This paper presents a method for rapid prediction of ore-specific breakage parameters from high-resolution, mineral-classified core images. The aim is to allow a rapid assessment of the variability in ore hardness within a mineral deposit with a reduced number of physical breakage tests. This method finds its application primarily in the project evaluation phase, where proper evaluation of the variability in ore hardness of the orebody normally requires a prolonged and costly metallurgical test work program. Applying this image-based texture analysis method to mineral-classified core images, the ores are classified according to their textural characteristics. A small number of physical tests are performed to produce a dataset used to develop the relationship between texture classes and measured ore hardness. The paper also presents a case study in which this method was applied to core samples from a copper porphyry deposit to predict the ore-specific breakage A*b parameter obtained from JKRBT tests.

Keywords: geometallurgy, hyperspectral drill core imaging, process simulation, texture analysis

Procedia PDF Downloads 355
5321 Comparison of Linear Discriminant Analysis and Support Vector Machine Classifications for Electromyography Signals Acquired at Five Positions of Elbow Joint

Authors: Amna Khan, Zareena Kausar, Saad Malik

Abstract:

Biomechatronics has extended applications in the field of rehabilitation, contributing since World War II to improving the applicability of prostheses and assistive devices in real-life scenarios. In this paper, classification accuracies are compared for two classifiers across five positions of the elbow. Electromyography (EMG) signals were acquired directly from the skeletal muscles of the human forearm for each of the three defined positions and at modified extreme positions of elbow flexion and extension, using an 8-electrode Myo armband sensor. Features were extracted from the filtered EMG signals for each position. The performance of two classifiers, the support vector machine (SVM) and linear discriminant analysis (LDA), was compared by analyzing the classification accuracies. SVM achieved classification accuracies between 90-96%, in contrast to the 84-87% achieved by LDA for the five defined elbow positions, keeping the number of samples and the selected features the same for both SVM and LDA.
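
A compact sketch of this comparison with scikit-learn follows. The EMG windows are synthetic, and the three time-domain features (mean absolute value, root mean square, waveform length) are common EMG choices assumed here, since the abstract does not list the exact feature set.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def emg_features(window):
    """Standard time-domain EMG features for one channel window."""
    mav = np.mean(np.abs(window))                 # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))           # root mean square
    wl = np.sum(np.abs(np.diff(window)))          # waveform length
    return [mav, rms, wl]

# Hypothetical dataset: 8-channel windows labeled with 5 elbow positions.
rng = np.random.default_rng(0)
windows = rng.standard_normal((250, 8, 200))      # 250 windows, 200 samples each
labels = rng.integers(0, 5, size=250)
X = np.array([[f for ch in w for f in emg_features(ch)] for w in windows])

for name, clf in [("SVM", SVC()), ("LDA", LinearDiscriminantAnalysis())]:
    print(name, cross_val_score(clf, X, labels, cv=5).mean())
```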

Keywords: classification accuracies, electromyography, linear discriminant analysis (LDA), Myo armband sensor, support vector machine (SVM)

Procedia PDF Downloads 365
5320 Leveraging SHAP Values for Effective Feature Selection in Peptide Identification

Authors: Sharon Li, Zhonghang Xia

Abstract:

Post-database search is an essential phase in peptide identification using tandem mass spectrometry (MS/MS) to refine peptide-spectrum matches (PSMs) produced by database search engines. These engines frequently face difficulty differentiating between correct and incorrect peptide assignments. Despite advances in statistical and machine learning methods aimed at improving the accuracy of peptide identification, challenges remain in selecting critical features for these models. In this study, two machine learning models—a random forest tree and a support vector machine—were applied to three datasets to enhance PSMs. SHAP values were utilized to determine the significance of each feature within the models. The experimental results indicate that the random forest model consistently outperformed the SVM across all datasets. Further analysis of SHAP values revealed that the importance of features varies depending on the dataset, indicating that a feature's role in model predictions can differ significantly. This variability in feature selection can lead to substantial differences in model performance, with false discovery rate (FDR) differences exceeding 50% between different feature combinations. Through SHAP value analysis, the most effective feature combinations were identified, significantly enhancing model performance.
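
A minimal sketch of the SHAP-based feature ranking, assuming the shap package and a random forest: mean absolute SHAP values rank the PSM features, and the top-k set is kept. The PSM feature matrix below is a random placeholder for real search-engine scores.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

# Hypothetical PSM features (rows = peptide-spectrum matches) and
# labels (1 = correct target match, 0 = decoy/incorrect).
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 8))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 1000) > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
sv = shap.TreeExplainer(model).shap_values(X)
if isinstance(sv, list):          # older shap: one array per class
    sv = sv[1]
elif sv.ndim == 3:                # newer shap: (samples, features, classes)
    sv = sv[:, :, 1]

importance = np.abs(sv).mean(axis=0)              # mean |SHAP| per feature
ranking = np.argsort(importance)[::-1]
print("feature ranking:", ranking, "selected:", ranking[:4])
```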

Keywords: peptide identification, SHAP value, feature selection, random forest tree, support vector machine

Procedia PDF Downloads 12
5319 Identification and Classification of Medicinal Plants of Indian Himalayan Region Using Hyperspectral Remote Sensing and Machine Learning Techniques

Authors: Kishor Chandra Kandpal, Amit Kumar

Abstract:

The Indian Himalaya region harbours approximately 1748 plants of medicinal importance, and as per the International Union for Conservation of Nature (IUCN), 112 of these plant species are threatened or endangered. To ease the pressure on these plants, the government of India is encouraging their in-situ cultivation. Saussurea costus, Valeriana jatamansi, and Picrorhiza kurroa have been prioritized for large-scale cultivation owing to their market demand, conservation value, and medicinal properties. These species are found at elevations from 1000 m to 4000 m in the Indian Himalaya. Identifying these plants in the field requires taxonomic skills, which is one of the major bottlenecks in their conservation and management. In recent years, hyperspectral remote sensing techniques have been used to discriminate plant species precisely with the help of their unique spectral signatures. Against this background, a spectral library of the above three medicinal plants was prepared by collecting spectral data with a handheld spectroradiometer (325 to 1075 nm) from farmers' fields in the Himachal Pradesh and Uttarakhand states of the Indian Himalaya. A random forest (RF) model was applied to the spectral data for the classification of the medicinal plants. The standard 80:20 split ratio was followed for training and validation of the RF model, which resulted in a training accuracy of 84.39% (kappa coefficient = 0.72) and a testing accuracy of 85.29% (kappa coefficient = 0.77). The RF classifier identified the green (555 to 598 nm), red (605 nm), and near-infrared (725 to 840 nm) wavelength regions as suitable for discriminating these species. The findings of this study provide a technique for rapid, on-site identification of these medicinal plants in the field. They will also be a key input for the classification of hyperspectral remote sensing images for mapping these species in farmers' fields on a regional scale. This is a pioneering study for medicinal plants in the Indian Himalaya region in which the applicability of hyperspectral remote sensing has been explored.
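
The classification workflow maps naturally onto scikit-learn, as in the hedged sketch below: reflectance spectra with one column per wavelength band, an 80:20 split, and accuracy plus Cohen's kappa. The spectra here are randomly generated placeholders for the real spectroradiometer library.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

wavelengths = np.arange(325, 1076)                   # 325-1075 nm, 1 nm steps
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(240, wavelengths.size))  # placeholder reflectance spectra
y = rng.integers(0, 3, size=240)                     # labels for the 3 species

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

pred = rf.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred))
print("kappa:", cohen_kappa_score(y_te, pred))

# Band importances indicate which wavelength regions drive the discrimination.
top = wavelengths[np.argsort(rf.feature_importances_)[::-1][:10]]
print("most informative bands (nm):", np.sort(top))
```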

Keywords: himalaya, hyperspectral remote sensing, machine learning, medicinal plants, random forests

Procedia PDF Downloads 199
5318 Enhancing Code Security with AI-Powered Vulnerability Detection

Authors: Zzibu Mark Brian

Abstract:

As software systems become increasingly complex, ensuring code security is a growing concern. Traditional vulnerability detection methods often rely on manual code reviews or static analysis tools, which can be time-consuming and prone to errors. This paper presents a distinct approach to enhancing code security by leveraging artificial intelligence (AI) and machine learning (ML) techniques. Our proposed system utilizes a combination of natural language processing (NLP) and deep learning algorithms to identify and classify vulnerabilities in real-world codebases. By analyzing vast amounts of open-source code data, our AI-powered tool learns to recognize patterns and anomalies indicative of security weaknesses. We evaluated our system on a dataset of over 10,000 open-source projects, achieving an accuracy rate of 92% in detecting known vulnerabilities. Furthermore, our tool identified previously unknown vulnerabilities in popular libraries and frameworks, demonstrating its potential for improving software security.

Keywords: AI, machine learning, code security

Procedia PDF Downloads 26
5317 The Lopsided Burden of Non-Communicable Diseases in India: Evidences from the Decade 2004-2014

Authors: Kajori Banerjee, Laxmi Kant Dwivedi

Abstract:

India is a part of the ongoing globalization, convergence, industrialization, and technical advancement taking place worldwide. Among the manifestations of this evolution are rapid demographic, socio-economic, epidemiological, and health transitions, with a considerable increase in non-communicable diseases due to changes in lifestyle. This study aims to assess the direction of the burden of disease by comparing the pressure of infectious diseases against cardio-vascular, endocrine, metabolic, and nutritional diseases. The change in prevalence over a ten-year period (2004-2014) is further decomposed to determine the net contribution of various socio-economic and demographic covariates. The present study uses the recent 71st (2014) and 60th (2004) rounds of the National Sample Survey. The pressure of infectious diseases against cardio-vascular (CVD) and endocrine, metabolic, and nutritional (EMN) diseases during 2004-2014 is calculated through Prevalence Rates (PR), Hospitalization Rates (HR), and Case Fatality Rates (CFR). The prevalence of non-communicable diseases is further used as the dependent variable in a logit regression to find the effect of various social, economic, and demographic factors on the chances of suffering from a particular disease. A multivariate decomposition technique further assists in determining the net contribution of socio-economic and demographic covariates. This paper presents evidence of stagnation in the burden of communicable diseases (CD) and a rapid increase in the burden of non-communicable diseases (NCD) uniformly across all population sub-groups in India. The CFR for CVD increased drastically over 2004-2014. The logit regression indicates that the chances of suffering from CVD and EMN diseases are significantly higher among urban residents, older ages, females, and widowed, divorced, or separated individuals. The decomposition offers ample evidence that improvements in quality-of-life markers such as education, urbanization, and longevity have contributed positively to the increase in NCD prevalence. In India's current epidemiological phase, the compression theory of morbidity is in action, as a significant rise over the study period is observed in the probability of contracting NCDs at older ages. Age is found to be a vital contributor to the increasing probability of having CVD and EMN diseases over the study decade 2004-2014 in the nationally representative National Sample Survey sample.

Keywords: cardio-vascular disease, case-fatality rate, communicable diseases, hospitalization rate, multivariate decomposition, non-communicable diseases, prevalence rate

Procedia PDF Downloads 309
5316 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications

Authors: H. Hruschka

Abstract:

This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables. Note that variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine which then generates second-layer hidden variables. Finally, in the third layer hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves. One half is used for estimation, the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. Performance of the topic models is disappointing as the holdout log likelihood of the correlated topic model – which is better than Dirichlet allocation - is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret hidden variables discovered by binary factor analysis, the restricted Boltzmann machine and the deep belief net. Hidden variables characterized by the product categories to which they are related differ strongly between these three models. To derive managerial implications we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications as they disagree about which categories are associated with higher basket size increases due to a promotion. Of course, recommendations based on better performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research by appropriate extensions. To include predictors, especially marketing variables such as price, seems to be an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of purchases of product categories.
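
A hedged sketch of the restricted Boltzmann machine layer with scikit-learn's BernoulliRBM follows. Note the caveats: sklearn reports a pseudo-log-likelihood rather than the exact holdout log likelihood used in the paper, the basket matrix below is random rather than the real grocery data, and stacking two RBMs only approximates the paper's three-layer deep belief net (no fine-tuning step is shown).

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

# Placeholder binary basket matrix: the paper uses 9,835 baskets x 169
# product categories; a small random stand-in is used here.
rng = np.random.default_rng(0)
baskets = (rng.uniform(size=(1000, 169)) < 0.05).astype(float)
train, holdout = baskets[:500], baskets[500:]

# First layer: hidden variables over observed category purchases.
rbm1 = BernoulliRBM(n_components=30, learning_rate=0.05, n_iter=50,
                    random_state=0).fit(train)
print("holdout pseudo-log-likelihood:", rbm1.score_samples(holdout).mean())

# Second layer on first-layer hidden activations, mimicking the DBN stack.
h1 = rbm1.transform(train)
rbm2 = BernoulliRBM(n_components=15, learning_rate=0.05, n_iter=50,
                    random_state=0).fit(h1)

# Hidden-variable interpretation: categories most strongly tied to each unit.
for j in range(3):
    top = np.argsort(rbm1.components_[j])[::-1][:5]
    print(f"hidden unit {j}: top category indices {top}")
```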

Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models

Procedia PDF Downloads 194
5315 Comparison of Different Machine Learning Models for Time-Series Based Load Forecasting of Electric Vehicle Charging Stations

Authors: H. J. Joshi, Satyajeet Patil, Parth Dandavate, Mihir Kulkarni, Harshita Agrawal

Abstract:

As the world looks towards a sustainable future, electric vehicles have become increasingly popular, and millions worldwide are looking to switch from the previously favored combustion-engine cars to electric cars. This demand has driven an increase in electric vehicle charging stations. The big challenge is that the randomness of charging demand makes it tough for these stations to provide an adequate amount of energy over a specific period of time. It has therefore become increasingly crucial to model these patterns and forecast the energy needs of charging stations. This paper analyzes how different machine learning models perform on electric vehicle charging time-series data. The data set consists of authentic electric vehicle data from the Netherlands, covering ten thousand transactions from public stations operated by EVnetNL.

Keywords: forecasting, smart grid, electric vehicle load forecasting, machine learning, time series forecasting

Procedia PDF Downloads 100
5314 Novel Hole-Bar Standard Design and Inter-Comparison for Geometric Errors Identification on Machine-Tool

Authors: F. Viprey, H. Nouira, S. Lavernhe, C. Tournier

Abstract:

Manufacturing of freeform parts may be achieved on 5-axis machine tools, currently considered a common means of production. In particular, the geometrical quality of freeform parts depends on the accuracy of the multi-axis structural loop, which is composed of several component assemblies maintaining the relative positioning between the tool and the workpiece. Therefore, to reach high geometrical quality of freeform parts, the geometric errors of the 5-axis machine should be evaluated and compensated, which allows one to master the deviations between the tool and the workpiece (volumetric accuracy). In this study, a novel hole-bar design was developed and used for the characterization of the geometric errors of an RRTTT 5-axis machine tool. The hole-bar standard is made of Invar, selected because it is less sensitive to thermal drift. The proposed design allows one to extract three intrinsic parameters: one linear positioning error and two straightness errors. These parameters can be obtained by measuring the cylindricity of 12 holes (bores) and 11 cylinders located on a perpendicular plane. By mathematical analysis, twelve 3D point coordinates can be identified, corresponding to the intersection of each hole axis with the least-squares plane passing through the axes of two perpendicular neighbouring cylinders. The hole-bar was calibrated using a precision CMM at LNE, traceable to the SI metre definition. The reversal technique was applied in order to separate the form errors of the hole-bar from the motion errors of the mechanical guiding systems. An inter-comparison was additionally conducted between four NMIs (National Metrology Institutes) within the EMRP IND62 JRP-TIM project. Afterwards, the hole-bar was integrated into the RRTTT 5-axis machine tool to identify its volumetric errors. Measurements were carried out in real time, combining raw data acquired by the Renishaw RMP600 touch probe with the linear and rotary encoders. The geometric errors of the 5-axis machine were also evaluated by an accurate laser tracer interferometer system, and the results were compared to those obtained with the hole-bar.

Keywords: volumetric errors, CMM, 3D hole-bar, inter-comparison

Procedia PDF Downloads 382
5313 Failure Analysis of Laminated Veneer Bamboo Dowel Connections

Authors: Niloufar Khoshbakht, Peggi L. Clouston, Sanjay R. Arwade, Alexander C. Schreyer

Abstract:

Laminated veneer bamboo (LVB) is a structural engineered composite made from glued layers of bamboo. A relatively new building product, LVB is currently employed in similar sizes and applications as dimensional lumber. This study describes the results of a 3D elastic finite element model of half-hole specimens loaded in compression parallel to grain per ASTM D5764. The model simulates LVB fracture initiation due to shear stresses in the dowel joint and predicts the displacement at failure, validated through comparison with experimental results. The material fails at 1 mm displacement due to in-plane shear stresses. The paper clarifies the complex interactive state of in-plane shear, tension-perpendicular-to-grain, and compression-parallel-to-grain stresses that form different distributions in the critical zone beneath the bolt hole in half-hole specimens. These findings are instrumental in understanding the key factors and fundamental failure mechanisms that occur in LVB dowel connections, helping to devise safe standards and further LVB product adoption and design.

Keywords: composite, dowel connection, embedment strength, failure behavior, finite element analysis, Moso bamboo

Procedia PDF Downloads 263
5312 Surface-Enhanced Raman Spectroscopy-Based Detection of SARS-CoV-2 Through In Situ One-pot Electrochemical Synthesis of 3D Au-Lysate Nanocomposite Structures on Plasmonic Au Electrodes

Authors: Ansah Iris Baffour, Dong-Ho Kim, Sung-Gyu Park

Abstract:

The ongoing COVID-19 pandemic, caused by the SARS-CoV-2 virus, is gradually shifting to an endemic phase, which implies the outbreak is far from over and will be difficult to eradicate. Global cooperation has led to unified precautions that aim to suppress epidemiological spread (e.g., through travel restrictions) and reach herd immunity (through vaccinations); however, the primary strategy to restrain the spread of the virus in mass populations relies on screening protocols that enable rapid on-site diagnosis of infections. Herein, we employed surface-enhanced Raman spectroscopy (SERS) for the rapid detection of SARS-CoV-2 lysate on an Au-modified Au nanodimple (AuND) electrode. Through in situ one-pot Au electrodeposition on the AuND electrode, Au-lysate nanocomposites were synthesized, generating 3D internal hotspots for large SERS signal enhancements within 30 s of the deposition. The capture of lysate into the newly generated plasmonic nanogaps within the nanocomposite structures enhanced metal-spike protein contact in 3D space, and the nanogaps served as hotspots for sensitive detection. The limit of detection of SARS-CoV-2 lysate was 5 × 10⁻² PFU/mL. Interestingly, ultrasensitive detection of the lysates of influenza A/H1N1 and respiratory syncytial virus (RSV) was also possible, but the method showed ultimate selectivity for SARS-CoV-2 in lysate solution mixtures. We investigated the practical applicability of the approach for rapid on-site diagnosis by detecting SARS-CoV-2 lysate spiked in normal human saliva at ultralow concentrations. The results demonstrate the reliability and sensitivity of the assay for rapid diagnosis of COVID-19.

Keywords: label-free detection, nanocomposites, SARS-CoV-2, surface-enhanced Raman spectroscopy

Procedia PDF Downloads 118
5311 A Study on the Accelerated Life Cycle Test Method of the Motor for Home Appliances by Using Acceleration Factor

Authors: Youn-Sung Kim, Mi-Sung Kim, Jae-Kun Lee

Abstract:

This paper deals with an accelerated life cycle test method for motors used in home appliances that demand high reliability. The life cycle of parts in home appliances should be 10 years, because the life cycle of appliances such as washing machines, refrigerators, and TVs is at least 10 years. In the case of a washing machine, the established life cycle test for the motor is a 3000-cycle test (1 cycle = 2 hours). However, the 3000-cycle test incurs substantial time and cost. The objectives of this study are to reduce the life cycle test time and the number of test samples, which can be realized by using an acceleration factor for the test time and a reduction factor for the number of samples.
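
The time saving can be made concrete with a worked sketch. Assuming an Arrhenius temperature-acceleration model for the motor and a Weibull trade-off between test time and sample size, the snippet below computes the shortened test duration; the activation energy, temperatures, and Weibull slope are illustrative values, not the paper's.

```python
import math

K_BOLTZMANN = 8.617e-5          # eV/K

def arrhenius_af(ea_ev, t_use_c, t_acc_c):
    """Acceleration factor between use and accelerated (stress) temperature."""
    t_use, t_acc = t_use_c + 273.15, t_acc_c + 273.15
    return math.exp(ea_ev / K_BOLTZMANN * (1.0 / t_use - 1.0 / t_acc))

# Illustrative assumptions: Ea = 0.7 eV, 40 C in the field, 85 C on the rig.
af = arrhenius_af(0.7, 40.0, 85.0)

baseline_hours = 3000 * 2                        # 3000 cycles x 2 hours/cycle
accelerated_hours = baseline_hours / af
print(f"AF = {af:.1f}, test time {baseline_hours} h -> {accelerated_hours:.0f} h")

# Weibull sample-size trade-off: testing each sample longer allows fewer
# samples for the same demonstrated reliability, n2 = n1 * (t1/t2)^beta.
beta, n1 = 1.5, 20                               # Weibull slope, baseline sample count
extend = 2.0                                     # test each sample 2x longer
n2 = n1 * extend ** (-beta)
print(f"samples needed: {n1} -> {n2:.0f} when testing {extend}x longer")
```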

Keywords: accelerated life cycle test, motor reliability test, motor for washing machine, BLDC motor

Procedia PDF Downloads 628
5310 Customized Temperature Sensors for Sustainable Home Appliances

Authors: Merve Yünlü, Nihat Kandemir, Aylin Ersoy

Abstract:

Temperature sensors are used in home appliances not only to monitor the basic functions of the machine but also to minimize energy consumption and ensure safe operation. In parallel with the development of smart home applications and IoT algorithms, these sensors produce important data, such as the frequency of use of the machine and user preferences, and compile data critical to diagnostic processes for fault detection throughout an appliance's operational lifespan. Commercially available thin-film resistive temperature sensors have a well-established manufacturing procedure that allows them to operate over a wide temperature range. However, these sensors are over-designed for white goods applications: their operating range is between -70°C and 850°C, while home appliance applications require only 23°C to 500°C. To ensure the operation of commercial sensors over this wide temperature range, a platinum coating of approximately 1 micron thickness is usually applied to the wafer. However, the use of platinum and the high coating thickness extend the sensor production process and therefore increase sensor costs. In this study, an attempt was made to develop a low-cost temperature sensor design and production method that meets the technical requirements of white goods applications. For this purpose, a custom design was made, and the design parameters (length, width, trim points, and thin-film deposition thickness) were optimized using statistical methods to achieve the desired resistivity value. A one-side-polished sapphire wafer was used to develop the thin-film resistive temperature sensors. To enhance adhesion and insulation, 100 nm of silicon dioxide was deposited by the inductively coupled plasma chemical vapor deposition technique. The lithography was performed with a direct laser writer, and the lift-off process was performed after e-beam evaporation of 10 nm titanium and 280 nm platinum layers. Standard four-point-probe sheet resistance measurements were made at room temperature. Resistivity measurements were made with a probe station before and after annealing at 600°C in a rapid thermal processing machine, and the temperature dependence between 25-300°C was also tested. As a result of this study, a temperature sensor was developed that has a lower coating thickness than commercial sensors but can produce reliable data over the white goods application temperature range. A relatively simple but optimized production method was also developed to produce this sensor.

Keywords: thin film resistive sensor, temperature sensor, household appliance, sustainability, energy efficiency

Procedia PDF Downloads 68
5309 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection

Authors: Muhammad Ali

Abstract:

Cyber-attacks and anomalies on Internet of Things (IoT) infrastructure are an emerging concern in the domain of data-driven intrusion detection, and rapidly increasing IoT risk is now making headlines around the world. Denial of service, malicious control, data type probing, malicious operation, DDoS, scan, spying, and wrong setup are attacks and anomalies that can cause an IoT system to fail. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction, and IoT devices expose a wide variety of new cyber security attack vectors in network traffic. For further IoT development, and mainly for smart and IoT applications, intelligent processing and analysis of data are necessary; our approach addresses this need for security. We train and compare several machine learning models for accurately predicting attacks and anomalies on IoT systems, using ANOVA-based feature selection so that fewer features feed the prediction models that evaluate network traffic and help protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, NB, D.T., and R.F., seeking the most satisfactory test accuracy with fast detection. The evaluation covers precision, recall, F1 score, FPR, NPV, G.M., MCC, and AUC-ROC. The Random Forest algorithm achieved the best results with the least prediction time, with an accuracy of 99.98%.
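
The ANOVA-based selection step pairs naturally with scikit-learn's f_classif scoring, as in this hedged sketch; the traffic features, the choice of k, and the random-forest settings are placeholders rather than the paper's configuration.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report, matthews_corrcoef

# Placeholder IoT traffic features and attack/normal labels.
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 20))
y = (X[:, 2] - 0.7 * X[:, 11] + rng.normal(0, 0.8, 2000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# ANOVA F-test keeps the k features whose means differ most across classes.
selector = SelectKBest(f_classif, k=8).fit(X_tr, y_tr)
X_tr_s, X_te_s = selector.transform(X_tr), selector.transform(X_te)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr_s, y_tr)
pred = rf.predict(X_te_s)
print(classification_report(y_te, pred))          # precision, recall, F1
print("MCC:", matthews_corrcoef(y_te, pred))
```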

Keywords: machine learning, analysis of variance, Internet of Things, network security, intrusion detection

Procedia PDF Downloads 115