Search results for: top load washing machine
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5459

5159 The Shape Memory Recovery Properties under Load of a Polymer Composite

Authors: Abdul Basit, Gildas Lhostis, Bernard Durand

Abstract:

Shape memory polymers (SMPs) are replacing shape memory alloys (SMAs) in many applications, as SMPs have certain properties superior to those of SMAs. However, SMAs possess some properties, such as recovery under stress, that SMPs lack: SMPs cannot give complete recovery even under a small load. An SMP is initially heated close to its transition temperature (the glass transition temperature or the melting temperature). A force is then applied to deform the heated SMP to a specific position, and the SMP is allowed to cool while being held deformed. After cooling, the SMP retains the temporary shape. This temporary shape can be recovered by heating the SMP again to the same temperature used initially; as a result, it recovers its original position. An SMP can perform unconstrained recovery and constrained recovery; however, under load it only recovers partially. In this work, the recovery under load of an asymmetrical shape memory composite called CBCM-SMPC has been investigated. It is found that it has the ability to recover under different loads, and under these loads it shows strong, complete recovery with respect to its initial position. This property can be utilized in many applications.

Keywords: shape memory, polymer composite, thermo-mechanical testing, recovery under load

Procedia PDF Downloads 429
5158 Experimental Investigation of Folding of Rubber-Filled Circular Tubes on Energy Absorption Capacity

Authors: MohammadSadegh SaeediFakher, Jafar Rouzegar, Hassan Assaee

Abstract:

In this research, the mechanical behavior and energy absorption capacity of empty and rubber-filled brass circular tubes under quasi-static axial loading are investigated experimentally. The tubes were cut out of commercially available brass circular tubes with the same length and diameter. Some of the specimens were filled with rubbers of three different Shore hardness values, and an empty tube was also prepared. The specimens were axially compressed between two rigid plates in a quasi-static process using a Zwick testing machine. Load-displacement diagrams and the energy absorption of the tested tubes were extracted from the experimental data. The results show that filling the brass tubes with rubber causes them to absorb more energy, and the energy absorption of the specimens increases with the Shore hardness of the rubber. In comparison with the empty tube, the first fold for the rubber-filled tubes occurs at a lower load, and it can be concluded that the rubber-filled tubes are better energy absorbers than the empty tubes. Also, in contrast with the empty tubes, the tubes filled with lower-hardness rubber deform asymmetrically.
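
For readers unfamiliar with how such results are quantified, the absorbed energy is simply the area under the measured load-displacement curve. The short Python sketch below illustrates this with purely hypothetical data (the displacement, load, and mass values are not from this study):

```python
# Minimal sketch: absorbed energy from a quasi-static crush test as the area
# under the load-displacement curve. All values below are illustrative only.
import numpy as np

displacement_mm = np.linspace(0.0, 60.0, 7)                      # crush displacement (mm), hypothetical
load_kN = np.array([0.0, 18.0, 12.0, 15.0, 11.0, 14.0, 10.0])    # axial load (kN), hypothetical

# Absorbed energy = integral of load over displacement (kN*mm = J)
energy_J = np.trapz(load_kN, displacement_mm)

# Specific energy absorption is often reported per unit crushed mass
crushed_mass_kg = 0.12                                           # hypothetical specimen mass
sea_J_per_kg = energy_J / crushed_mass_kg

print(f"Absorbed energy: {energy_J:.1f} J, SEA: {sea_J_per_kg:.1f} J/kg")
```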

Keywords: axial compression, quasi-static loading, folding, energy absorbers, rubber-filled tubes

Procedia PDF Downloads 426
5157 The Evolving Customer Experience Management Landscape: A Case Study on the Paper Machine Companies

Authors: Babak Mohajeri, Sen Bao, Timo Nyberg

Abstract:

Customer experience is increasingly the differentiator between successful companies and those who struggle. Customer experiences are becoming more dynamic, and they advance with each interaction between the company and a customer. Every customer conversation, and any effort to evolve these conversations, is beneficial and should ultimately result in a positive customer experience. The aim of this paper is to analyze the evolving customer experience management landscape and the relevant challenges and opportunities. A case study on “paper machine” companies is chosen. Hence, this paper analyzes the challenges and opportunities in the customer experience management of paper machine companies for the case of the “road to steel”. The road to steel shows the journey of steel from raw material to end product (i.e. the paper machine in this paper). ALPHA (a steel company) and BETA (a paper machine company) are chosen, and their efforts to evolve the customer experiences are investigated. Semi-structured interviews are conducted with experts in those companies to identify the challenges and opportunities of evolving customer experience management from their point of view. The findings of this paper contribute to the theory and business practices in the realm of the evolving customer experience management landscape.

Keywords: customer experience management, paper machine, value chain management, risk analysis

Procedia PDF Downloads 353
5156 Auto-Tuning of CNC Parameters According to the Machining Mode Selection

Authors: Jenq-Shyong Chen, Ben-Fong Yu

Abstract:

CNC (computer numerical control) machining centers have been widely used for machining different metal components for various industries. For a specific CNC machine, its everyday job is to cut different products with quite different attributes such as material type, workpiece weight, geometry, tooling, and cutting conditions. Theoretically, the dynamic characteristics of the CNC machine should be properly tuned to match each machining job in order to obtain the optimal machining performance. However, most CNC machines are set with only a standard set of CNC parameters. In this study, we have developed an auto-tuning system which can automatically change the CNC parameters, and hence the machine's dynamic characteristics, according to the selection of machining modes, which are set by a mixed combination of three machine performance indexes: the HO (high surface quality) index, the HP (high precision) index, and the HS (high speed) index. The acceleration, jerk, corner error tolerance, oscillation, and dynamic bandwidth of the machine's feed axes are changed according to the selection of the machine performance indexes. The proposed auto-tuning system for the CNC parameters has been implemented on a PC-based CNC controller and a three-axis machining center. The measured experimental results have shown the promise of the proposed auto-tuning system.
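
As an illustration of the idea only (not the authors' controller implementation), the sketch below shows how a blend of HS, HP, and HO indexes could be mapped onto feed-axis parameters such as acceleration, jerk, corner tolerance, and bandwidth; all parameter names, baseline values, and scaling factors are assumptions:

```python
# Minimal sketch of machining-mode-driven parameter tuning; values are illustrative.
from dataclasses import dataclass

@dataclass
class AxisParams:
    acceleration: float      # m/s^2
    jerk: float              # m/s^3
    corner_tolerance: float  # mm
    bandwidth: float         # Hz

# Baseline (standard) parameter set of a hypothetical feed axis
BASE = AxisParams(acceleration=3.0, jerk=30.0, corner_tolerance=0.02, bandwidth=30.0)

def tune(hs: float, hp: float, ho: float) -> AxisParams:
    """Blend the HS (speed), HP (precision) and HO (surface quality) indexes, each in
    [0, 1], into one parameter set: speed pushes acceleration/jerk up, precision
    tightens the corner tolerance, surface quality lowers jerk to reduce oscillation."""
    total = hs + hp + ho or 1.0
    hs, hp, ho = hs / total, hp / total, ho / total
    return AxisParams(
        acceleration=BASE.acceleration * (1.0 + 0.8 * hs - 0.3 * ho),
        jerk=BASE.jerk * (1.0 + 1.0 * hs - 0.5 * ho),
        corner_tolerance=BASE.corner_tolerance * (1.0 - 0.7 * hp),
        bandwidth=BASE.bandwidth * (1.0 + 0.5 * hp),
    )

print(tune(hs=1.0, hp=0.0, ho=0.0))   # high-speed mode
print(tune(hs=0.0, hp=1.0, ho=0.0))   # high-precision mode
```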

Keywords: auto-tuning, CNC parameters, machining mode, high speed, high accuracy, high surface quality

Procedia PDF Downloads 373
5155 A Full-Scale Test of Coping-Girder Integrated Bridge

Authors: Heeyoung Lee, Woosung Bin, Kangseog Seo, Hyojeong Yun, Zuog An

Abstract:

Recently, a new continuous bridge system has been proposed to increase the space under the bridge and to improve the aesthetic aspect of the urban area. The main feature of the proposed bridge is to connect the steel I-girders and the coping by means of prestressed high-strength steel bars and steel plates. The proposed bridge lowers the height of the structure and ensures workability and efficiency through a reduction in the cost of road construction. This study presents the experimental results of a full-scale connection between steel I-girders and coping under negative bending moment. The composite behavior is thoroughly examined and discussed under specific load levels such as the service load, factored load, and crack load. The structural response showed full composite action up to the final load level, because no relative displacement between the coping and the girder was observed. It was also found that the prestressing force in the high-strength bars was able to control the tensile stresses of the deck slab. This indicates that cracks in the deck slab can be controlled by the above-mentioned prestressing force.

Keywords: coping, crack, integrated bridge, full-scale test

Procedia PDF Downloads 435
5154 Using TRACE, PARCS, and SNAP Codes to Analyze the Load Rejection Transient of ABWR

Authors: J. R. Wang, H. C. Chang, A. L. Ho, J. H. Yang, S. W. Chen, C. Shih

Abstract:

The purpose of this study is to analyze the load rejection transient of the ABWR by using the TRACE, PARCS, and SNAP codes. The study proceeds in several steps. First, the model of the ABWR is established using the TRACE, PARCS, and SNAP codes. Second, the key parameters are identified to refine the TRACE/PARCS/SNAP model further in the frame of a steady-state analysis. Third, the TRACE/PARCS/SNAP model is used to perform the load rejection transient analysis. Finally, the FSAR data are used for comparison with the analysis results. The TRACE/PARCS results are consistent with the FSAR data for the important parameters, indicating that the TRACE/PARCS/SNAP model of the ABWR has good accuracy for the load rejection transient.

Keywords: ABWR, TRACE, PARCS, SNAP

Procedia PDF Downloads 191
5153 Assessment of Solar Hydrogen Production in Energetic Hybrid PV-PEMFC System

Authors: H. Rezzouk, M. Hatti, H. Rahmani, S. Atoui

Abstract:

This paper discusses the design and analysis of a hybrid PV-fuel cell energy system designed to power a DC load. The system is composed of a photovoltaic array, a fuel cell, an electrolyzer, and a hydrogen tank. HOMER software is used in this study to calculate the optimum capacities of the power system components whose combination allows efficient use of the solar resource to cover the hourly load needs. The optimal system sizing establishes the right balance between the daily electrical energy produced by the power system and the daily electrical energy consumed by the DC load, using a 28 kW PV array, a 7.5 kW fuel cell, a 40 kW electrolyzer, and a 270 kg hydrogen tank. The variation of the powers involved on the DC bus of the hybrid PV-fuel cell system has been computed and analyzed for each hour over one year: the output powers of the PV array and the fuel cell, the input power of the electrolyzer system, and the DC primary load. Equally, the annual variation of the stored hydrogen produced by the electrolyzer has been assessed. The PV array contributes 82% of the energy in the power system, whereas the fuel cell produces 18%. 38% of the total energy consumption belongs to the DC primary load, while the rest goes to the electrolyzer.
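
A minimal sketch of the hourly energy balance described above is given below. It is not the HOMER model; the efficiencies, hydrogen heating value, and the half-day PV/load profiles are illustrative assumptions only:

```python
# Minimal sketch: PV surplus feeds the electrolyzer to make hydrogen, and the fuel
# cell burns stored hydrogen when PV cannot cover the DC load. Numbers are illustrative.
H2_KWH_PER_KG = 33.3          # approximate lower heating value of hydrogen
ELECTROLYZER_EFF = 0.7        # assumed electrical-to-H2 efficiency
FUEL_CELL_EFF = 0.5           # assumed H2-to-electrical efficiency
TANK_CAPACITY_KG = 270.0

def simulate(pv_kw, load_kw, tank_kg=100.0):
    """Step through hourly PV output and DC load; return unmet load (kWh) and tank level (kg)."""
    unmet = 0.0
    for pv, load in zip(pv_kw, load_kw):
        surplus = pv - load                       # kWh over the hour
        if surplus >= 0:
            h2_made = surplus * ELECTROLYZER_EFF / H2_KWH_PER_KG
            tank_kg = min(TANK_CAPACITY_KG, tank_kg + h2_made)
        else:
            h2_needed = -surplus / (FUEL_CELL_EFF * H2_KWH_PER_KG)
            h2_used = min(tank_kg, h2_needed)
            tank_kg -= h2_used
            unmet += -surplus - h2_used * FUEL_CELL_EFF * H2_KWH_PER_KG
    return unmet, tank_kg

pv = [0, 0, 5, 15, 25, 28, 26, 18, 8, 0, 0, 0]     # kW, hypothetical half-day PV profile
load = [6, 6, 7, 9, 10, 11, 11, 10, 9, 8, 7, 6]    # kW, hypothetical DC load
print(simulate(pv, load))
```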

Keywords: electrolyzer, hydrogen, hydrogen fuel cell, photovoltaic

Procedia PDF Downloads 485
5152 Assessing the Effectiveness of Machine Learning Algorithms for Cyber Threat Intelligence Discovery from the Darknet

Authors: Azene Zenebe

Abstract:

Deep learning is a subset of machine learning which incorporates techniques for the construction of artificial neural networks and has been found useful for modeling complex problems with large datasets. Deep learning requires very high computational power and long training times. By aggregating computing power, high-performance computing (HPC) has emerged as an approach to resolving advanced problems and performing data-driven research activities. Cyber threat intelligence (CTI) is actionable information or insight an organization or individual uses to understand the threats that have targeted, will target, or are currently targeting the organization. Results of a review of the literature will be presented, along with the results of an experimental study that compares the performance of tree-based and function-based machine learning, including deep learning algorithms, using a secondary dataset collected from the darknet.

Keywords: deep-learning, cyber security, cyber threat modeling, tree-based machine learning, function-based machine learning, data science

Procedia PDF Downloads 146
5151 Support Vector Machine Based Retinal Therapeutic for Glaucoma Using Machine Learning Algorithm

Authors: P. S. Jagadeesh Kumar, Mingmin Pan, Yang Yung, Tracy Lin Huan

Abstract:

Glaucoma is a group of visual disorders characterized by progressive optic nerve neuropathy, leading to a gradual narrowing of the visual field and, ultimately, loss of sight. In this paper, a novel support vector machine based retinal therapeutic for glaucoma using a machine learning algorithm is presented. The algorithm is practical to fit; built on a correlation clustering mode, it performs its computations in a multi-dimensional space. Support vector clustering turns out to be comparable to the scale-space approach, which investigates the cluster organization by means of a kernel density estimation of the probability distribution, where cluster centres are identified by the local maxima of the density. The proposed approach achieved a 91% accuracy rate on a dataset consisting of 500 realistic images of healthy and glaucomatous retinas; the computational benefit of relying on the cluster overlapping system based on the machine learning algorithm therefore gives strong overall performance in the glaucoma therapeutic.

Keywords: machine learning algorithm, correlation clustering mode, cluster overlapping system, glaucoma, kernel density estimation, retinal therapeutic

Procedia PDF Downloads 242
5150 Auto Classification of Multiple ECG Arrhythmic Detection via Machine Learning Techniques: A Review

Authors: Ng Liang Shen, Hau Yuan Wen

Abstract:

Arrhythmia analysis of the ECG signal plays a major role in diagnosing most cardiac diseases. Arrhythmia detection from an electrocardiographic (ECG) record therefore involves matching each ECG beat against multiple patterns produced by various algorithms, using supervised machine learning. Previous researchers have used different features and classification methods to classify different arrhythmia types. A major problem in these studies is that the symptoms of the disease do not appear all the time in the ECG record; hence, a successful diagnosis might require the manual investigation of several hours of ECG recordings. This paper reviews investigations of cardiovascular ailments in electrocardiogram (ECG) signals for cardiac arrhythmia, based on the beat-by-beat examination of irregular ECG waveforms using machine learning pattern recognition.

Keywords: electrocardiogram, ECG, classification, machine learning, pattern recognition, detection, QRS

Procedia PDF Downloads 369
5149 Air Quality Analysis Using Machine Learning Models Under Python Environment

Authors: Salahaeddine Sbai

Abstract:

Air quality analysis using machine learning models is a method employed to assess and predict air pollution levels. This approach leverages the capabilities of machine learning algorithms to analyze vast amounts of air quality data and extract valuable insights. By training these models on historical air quality data, they can learn patterns and relationships between various factors such as weather conditions, pollutant emissions, and geographical features. The trained models can then be used to predict air quality levels in real time or forecast future pollution levels. This application of machine learning in air quality analysis enables policymakers, environmental agencies, and the general public to make informed decisions regarding health, environmental impact, and mitigation strategies. By understanding the factors influencing air quality, interventions can be implemented to reduce pollution levels, mitigate health risks, and enhance overall air quality management. Climate change is having significant impacts on Morocco, affecting various aspects of the country's environment, economy, and society. In this study, we use several machine learning models in a Python environment to predict and analyse air quality changes over northern Morocco and to evaluate the impact of climate change on agriculture.
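
As a rough illustration of the workflow (not the study's actual data or model choices), the following Python sketch trains a regressor to predict a pollutant level from synthetic weather and emission features:

```python
# Minimal sketch: predicting an air-quality indicator from weather/emission features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 2000
temperature = rng.normal(22, 6, n)       # degrees C, hypothetical
wind_speed = rng.gamma(2.0, 2.0, n)      # m/s
humidity = rng.uniform(20, 90, n)        # %
traffic_index = rng.uniform(0, 1, n)     # proxy for pollutant emissions

# Hypothetical generating rule: pollution rises with traffic and falls with wind
pm10 = 40 + 60 * traffic_index - 4 * wind_speed + 0.2 * (30 - temperature) + rng.normal(0, 5, n)

X = np.column_stack([temperature, wind_speed, humidity, traffic_index])
X_train, X_test, y_train, y_test = train_test_split(X, pm10, random_state=0)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)
print("R^2 on held-out data:", round(r2_score(y_test, model.predict(X_test)), 3))
```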

Keywords: air quality, machine learning models, pollution, pollutant emissions

Procedia PDF Downloads 88
5148 The Acute Effects of Higher Versus Lower Load Duration and Intensity on Morphological and Mechanical Properties of the Healthy Achilles Tendon: A Randomized Crossover Trial

Authors: Eman Merza, Stephen Pearson, Glen Lichtwark, Peter Malliaras

Abstract:

The Achilles tendon (AT) exhibits volume changes related to fluid flow under acute load which may be linked to changes in stiffness. Fluid flow provides a mechanical signal for cellular activity and may be one mechanism that facilitates tendon adaptation. This study aimed to investigate whether isometric intervention involving a high level of load duration and intensity could maximize the immediate reduction in AT volume and stiffness compared to interventions involving a lower level of load duration and intensity. Sixteen healthy participants (12 males, 4 females; age= 24.4 ± 9.4 years; body mass= 70.9 ± 16.1 kg; height= 1.7 ± 0.1 m) performed three isometric interventions of varying levels of load duration (2 s and 8 s) and intensity (35% and 75% maximal voluntary isometric contraction) over a 3 week period. Freehand 3D ultrasound was used to measure free AT volume (at rest) and length (at 35%, 55%, and 75% of maximum plantarflexion force) pre- and post-interventions. The slope of the force-elongation curve over these force levels represented individual stiffness (N/mm). Large reductions in free AT volume and stiffness resulted in response to long-duration high-intensity loading whilst less reduction was produced with a lower load intensity. In contrast, no change in free AT volume and a small increase in AT stiffness occurred with lower load duration. These findings suggest that the applied load on the AT must be heavy and sustained for a long duration to maximize immediate volume reduction, which might be an acute response that enables optimal long-term tendon adaptation via mechanotransduction pathways.
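
Since stiffness is defined here as the slope of the force-elongation curve over the 35-75% force range, a minimal sketch of that calculation is shown below; the force and elongation values are hypothetical, not the study's measurements:

```python
# Minimal sketch: tendon stiffness (N/mm) as the slope of force against elongation.
import numpy as np

# Hypothetical plantarflexion force (N) and corresponding free-tendon elongation (mm)
force_N = np.array([350.0, 550.0, 750.0])       # at 35%, 55%, 75% of maximum force
elongation_mm = np.array([2.1, 3.0, 3.8])

# Stiffness = slope of the best-fit line through the force-elongation points
stiffness, _ = np.polyfit(elongation_mm, force_N, 1)
print(f"AT stiffness: {stiffness:.0f} N/mm")
```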

Keywords: Achilles tendon, volume, stiffness, free tendon, 3d ultrasound

Procedia PDF Downloads 88
5147 Review of Different Machine Learning Algorithms

Authors: Syed Romat Ali Shah, Bilal Shoaib, Saleem Akhtar, Munib Ahmad, Shahan Sadiqui

Abstract:

Classification is a data mining technique based on machine learning (ML) algorithms. It is used to classify individual items in a collection of information into a set of predefined classes or groups. Web mining is also a part of this family of data mining methods. The main purpose of this paper is to analyse and compare the performance of the Naïve Bayes algorithm, Decision Tree, K-Nearest Neighbour (KNN), Artificial Neural Network (ANN), and Support Vector Machine (SVM). The paper covers these ML algorithms together with their advantages and disadvantages, and also identifies open research issues.
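
A minimal sketch of such a comparison, using scikit-learn and a synthetic dataset (not any dataset from the paper), might look as follows:

```python
# Minimal sketch: cross-validated comparison of the five classifiers named above.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic stand-in for a labelled document/web dataset
X, y = make_classification(n_samples=1000, n_features=20, n_informative=8, random_state=0)

classifiers = {
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "ANN (MLP)": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
    "SVM": SVC(kernel="rbf"),
}

# 5-fold cross-validated accuracy for each classifier
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name:15s} {scores.mean():.3f} +/- {scores.std():.3f}")
```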

Keywords: data mining, web mining, classification, ML algorithms

Procedia PDF Downloads 290
5146 Flood-Prone Urban Area Mapping Using Machine Learning: A Case Study of M'sila City (Algeria)

Authors: Medjadj Tarek, Ghribi Hayet

Abstract:

This study aims to develop a flood sensitivity assessment tool using machine learning (ML) techniques and a geographic information system (GIS). The importance of this study lies in integrating geographic information systems (GIS) and machine learning (ML) techniques for mapping flood risks, which helps decision-makers identify the most vulnerable areas and take the necessary precautions to face this type of natural disaster. To reach this goal, we study the case of the city of M'sila, which is among the areas most vulnerable to floods. The study produced a map of flood-prone areas based on a methodology in which three machine learning algorithms were compared: the XGBoost model, the Random Forest algorithm, and the K-Nearest Neighbour algorithm, which achieved accuracies of 97.92%, 95%, and 93.75%, respectively. In mapping the flood-prone areas, the model that gave the greatest accuracy (XGBoost) was relied upon.
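
A minimal sketch of the model comparison described above is shown below. It uses synthetic terrain features, and scikit-learn's GradientBoostingClassifier stands in for XGBoost so that no extra dependency is required; all feature names and the labelling rule are assumptions:

```python
# Minimal sketch: comparing three classifiers on a synthetic flood/no-flood task.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 1500
elevation = rng.uniform(350, 550, n)          # m, hypothetical
slope = rng.uniform(0, 15, n)                 # degrees
dist_to_river = rng.uniform(0, 3000, n)       # m
rainfall = rng.uniform(100, 400, n)           # mm/year

# Hypothetical rule: low, flat cells near the river under heavy rainfall flood more often
risk = (550 - elevation) / 200 + (3000 - dist_to_river) / 3000 + rainfall / 400 - slope / 15
flooded = (risk + rng.normal(0, 0.3, n) > 1.2).astype(int)

X = np.column_stack([elevation, slope, dist_to_river, rainfall])
for name, clf in [("gradient boosting (XGBoost stand-in)", GradientBoostingClassifier(random_state=0)),
                  ("random forest", RandomForestClassifier(n_estimators=200, random_state=0)),
                  ("k-nearest neighbours", KNeighborsClassifier(n_neighbors=7))]:
    print(name, round(cross_val_score(clf, X, flooded, cv=5).mean(), 3))
```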

Keywords: geographic information systems (GIS), machine learning (ML), emergency mapping, flood disaster management

Procedia PDF Downloads 86
5145 The Role of Optimization and Machine Learning in e-Commerce Logistics in 2030

Authors: Vincenzo Capalbo, Gianpaolo Ghiani, Emanuele Manni

Abstract:

Global e-commerce sales have reached unprecedented levels in the past few years. As this trend is only predicted to go up as we continue into the ’20s, new challenges will be faced by companies when planning and controlling e-commerce logistics. In this paper, we survey the related literature on Optimization and Machine Learning as well as on combined methodologies. We also identify the distinctive features of next-generation planning algorithms - namely scalability, model-and-run features and learning capabilities - that will be fundamental to cope with the scale and complexity of logistics in the next decade.

Keywords: e-commerce, hardware acceleration, logistics, machine learning, mixed integer programming, optimization

Procedia PDF Downloads 237
5144 Comparison of Machine Learning Models for the Prediction of System Marginal Price of Greek Energy Market

Authors: Ioannis P. Panapakidis, Marios N. Moschakis

Abstract:

The Greek Energy Market is structured as a mandatory pool where the producers make their bid offers on a day-ahead basis. The System Operator solves an optimization routine aiming at the minimization of the cost of produced electricity. The solution of the optimization problem leads to the calculation of the System Marginal Price (SMP). Accurate forecasts of the SMP can lead to increased profits and more efficient portfolio management from the producer's perspective. The aim of this study is to provide a comparative analysis of various machine learning models, such as artificial neural networks and neuro-fuzzy models, for the prediction of the SMP of the Greek market. Machine learning algorithms are favored in prediction problems since they can capture and simulate the volatilities of complex time series.
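
As a rough illustration only (not the authors' models or market data), the sketch below forecasts a synthetic hourly SMP series with a small neural network fed by the previous 24 hourly prices:

```python
# Minimal sketch: lag-based SMP forecasting with a neural network on synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
hours = np.arange(24 * 200)
# Hypothetical SMP series (EUR/MWh) with a daily cycle plus noise
smp = 55 + 12 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

LAGS = 24
X = np.column_stack([smp[i:-(LAGS - i)] for i in range(LAGS)])  # previous 24 hourly prices
y = smp[LAGS:]

split = -24 * 7                             # hold out the last week
model = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
print("MAE on hold-out week:", round(mean_absolute_error(y[split:], model.predict(X[split:])), 2))
```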

Keywords: deregulated energy market, forecasting, machine learning, system marginal price

Procedia PDF Downloads 207
5143 Volume Density of Power of Multivector Electric Machine

Authors: Aldan A. Sapargaliyev, Yerbol A. Sapargaliyev

Abstract:

Since its invention, the electric machine (EM) can be defined as an oEM, a one-vector electric machine, as it works through a one-vector inductive coupling using a one-vector electromagnet. The disadvantages of the oEM are its large size and limited efficiency in low and medium power applications. This paper describes the multi-vector electric machine (mEM), based on multi-vector inductive coupling, which is characterized by an increased surface area of the inductive coupling per EM volume, with a reduced share of the inefficient and energy-consuming part of the winding, in comparison with oEMs. In particular, the performance and power of three different electric motors are considered, calculated, and compared at the same volumes and rotor frequencies. The calculated correlation between power density and volume for the oEM and mEM is also presented. The method of multi-vector inductive coupling enables the mEM to achieve a 1.5-4.0 times greater power density per volume and significantly higher efficiency, in comparison with today's oEM, especially in low and medium power applications. The mEM has distinct advantages when used in transport vehicles such as electric cars and aircraft.

Keywords: electric machine, electric motor, electromagnet, efficiency of electric motor

Procedia PDF Downloads 329
5142 Load Balancing Technique for Energy Efficiency in Cloud Computing

Authors: Rani Danavath, V. B. Narsimha

Abstract:

Cloud computing is emerging as a new paradigm of large-scale distributed computing. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. The cloud model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing; it is required to distribute the dynamic workload across multiple nodes so that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. The goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This establishes the need for new metrics, namely energy consumption and carbon emission, for energy-efficient load balancing techniques in cloud computing. Existing load balancing techniques mainly focus on reducing overhead, service response time, and improving performance, but none of them has considered energy consumption and carbon emission. In this paper, we therefore introduce a load balancing technique oriented towards energy efficiency. This energy-efficient load balancing technique can be used to improve the performance of cloud computing by balancing the workload across all the nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent, which will help to achieve green computing.
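
To make the idea concrete, the sketch below shows a simple least-loaded placement heuristic with a crude per-node power proxy. It is an illustrative assumption, not the technique proposed in the paper, and all capacity and power figures are invented:

```python
# Minimal sketch: greedy least-loaded task placement plus a rough energy proxy.
def balance(tasks, node_capacity, n_nodes):
    """Each task goes to the node with the most spare capacity, so no node is overloaded."""
    loads = [0.0] * n_nodes
    placement = []
    for demand in sorted(tasks, reverse=True):      # place the largest tasks first
        node = min(range(n_nodes), key=lambda i: loads[i])
        if loads[node] + demand > node_capacity:
            raise RuntimeError("cluster overloaded")
        loads[node] += demand
        placement.append(node)
    return loads, placement

tasks = [0.2, 0.5, 0.1, 0.7, 0.4, 0.3, 0.6]          # normalised CPU demands (hypothetical)
loads, placement = balance(tasks, node_capacity=1.0, n_nodes=4)

# Crude energy proxy: an active node draws a fixed idle power plus a dynamic share
# proportional to its utilisation; empty nodes are assumed powered down.
IDLE_W, DYNAMIC_W = 100.0, 150.0                     # hypothetical per-node power figures
energy_w = sum(IDLE_W + DYNAMIC_W * u for u in loads if u > 0)
print(loads, placement, f"estimated draw: {energy_w:.0f} W")
```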

Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission

Procedia PDF Downloads 443
5141 Effect of the Use of Steel Fibers (Dramix) on a Reinforced Concrete Slab

Authors: Faisal Ananda, Junaidi Al-Husein, Oni Febriani, Juli Ardita, N. Indra, Syaari Al-Husein, A. Bukri

Abstract:

Currently, concrete technology continues to grow and to innovate, one direction being the use of fibers. Fiber concrete has advantages over non-fiber concrete, among them resistance to shrinkage effects, reduced cracking, and fire resistance. In this study, the concrete mix was designed using the procedures listed in SNI 03-2834-2000. The samples used were cylinders with a height of 30 cm and a diameter of 15 cm, used for compression and tensile testing, while the slab measured 400 cm x 100 cm x 15 cm. The fiber used is steel fiber (Dramix), added over 2/3 of the slab thickness. Loading was applied using a two-point loading scheme. The results show that for the non-fiber (0%) slab, the initial crack was also the maximum crack, exceeding the allowable maximum crack width, with a width of 1.3 mm at a load of 1160 kg. The initial crack at the largest load was found in the 1% fiber slab; this initial crack was also the maximum crack, 0.5 mm wide, which likewise exceeded the allowable maximum crack width. In the 4% slab, the initial crack of 0.1 mm was the smallest initial crack, occurring at a load of 1200 kg, greater than the load of the non-fiber (0%) slab. The maximum load at the allowable maximum crack width was obtained for the 5% fiber slab, with a crack width of 0.32 mm at a load of 1250 kg.

Keywords: crack, dramix, fiber, load, slab

Procedia PDF Downloads 508
5140 A Deep Learning Approach to Subsection Identification in Electronic Health Records

Authors: Nitin Shravan, Sudarsun Santhiappan, B. Sivaselvan

Abstract:

Subsection identification, in the context of Electronic Health Records (EHRs), is the identification of the important sections for downstream tasks like auto-coding. In this work, we classify the text present in EHRs according to the information it contains, using machine learning and deep learning techniques. We first briefly describe the problem and formulate it as a text classification problem. Then, we discuss methods from the literature. We try two approaches: traditional feature-extraction-based machine learning methods and deep learning methods. Through experiments on a private dataset, we establish that the deep learning methods perform better than the feature-extraction-based machine learning models.
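
A minimal sketch of the traditional feature-extraction baseline mentioned above (TF-IDF features plus a linear classifier) is shown below; the snippets and section labels are invented, since the study's EHR dataset is private:

```python
# Minimal sketch: TF-IDF + linear classifier assigning EHR text snippets to sections.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hypothetical snippets labelled with their EHR section
texts = [
    "patient reports chest pain radiating to the left arm",
    "prescribed metformin 500 mg twice daily",
    "blood pressure 130/85, heart rate 78 bpm",
    "continue lisinopril, follow up in two weeks",
    "denies shortness of breath or palpitations",
    "temperature 37.2 C, respiratory rate 16",
]
sections = ["history", "medications", "vitals", "medications", "history", "vitals"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(texts, sections)
print(clf.predict(["started atorvastatin 20 mg at bedtime"]))  # likely 'medications'
```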

Keywords: deep learning, machine learning, semantic clinical classification, subsection identification, text classification

Procedia PDF Downloads 206
5139 Multi-Level Air Quality Classification in China Using Information Gain and Support Vector Machine

Authors: Bingchun Liu, Pei-Chann Chang, Natasha Huang, Dun Li

Abstract:

Machine Learning and Data Mining are two important tools for extracting useful information and knowledge from large datasets. In machine learning, classification is a widely used technique to predict qualitative variables and is generally preferred over regression from an operational point of view. Due to the enormous increase in air pollution in various countries, especially China, Air Quality Classification has become one of the most important topics in air quality research and modelling. This study aims at introducing a hybrid classification model based on information theory and the Support Vector Machine (SVM), using the air quality data of four cities in China, namely Beijing, Guangzhou, Shanghai and Tianjin, from Jan 1, 2014 to April 30, 2016. China's Ministry of Environmental Protection has classified daily air quality into six levels, namely Serious Pollution, Severe Pollution, Moderate Pollution, Light Pollution, Good and Excellent, based on their respective Air Quality Index (AQI) values. Using information theory, the information gain (IG) is calculated and feature selection is performed for both categorical features and continuous numeric features. Then the SVM machine learning algorithm is implemented on the selected features with cross-validation. The final evaluation reveals that the IG and SVM hybrid model performs better than the SVM (alone), Artificial Neural Network (ANN) and K-Nearest Neighbours (KNN) models in terms of accuracy as well as complexity.
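
A minimal sketch of the hybrid pipeline, assuming scikit-learn's mutual information as the information gain measure and synthetic pollutant features in place of the Chinese monitoring data, could look like this:

```python
# Minimal sketch: rank features by information gain, keep the top-k, classify AQI
# levels with an SVM under cross-validation. All data below are synthetic.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1200
pm25 = rng.gamma(2.0, 60.0, n)             # ug/m3, hypothetical
no2 = rng.gamma(2.0, 15.0, n)
wind = rng.gamma(2.0, 2.0, n)
noise = rng.normal(0, 1, (n, 4))           # uninformative columns that IG should discard

# Hypothetical 6-level label driven mainly by PM2.5 (breakpoints follow the daily scale)
aqi_level = np.digitize(pm25, [35, 75, 115, 150, 250])

X = np.column_stack([pm25, no2, wind, noise])
pipeline = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=3),
    SVC(kernel="rbf", C=10),
)
print("cross-validated accuracy:", round(cross_val_score(pipeline, X, aqi_level, cv=5).mean(), 3))
```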

Keywords: machine learning, air quality classification, air quality index, information gain, support vector machine, cross-validation

Procedia PDF Downloads 231
5138 Intelligent Production Machine

Authors: A. Şahinoğlu, R. Gürbüz, A. Güllü, M. Karhan

Abstract:

In this study of production machines, the aim is for the machine to automatically perceive cutting data and alter its cutting parameters. The two most important parameters to be controlled in the machine control unit are the feed rate and the speeds. These parameters are controlled based on the sound of the machine. The features of the optimum sound are introduced to the computer. During the process, real-time data are received and converted by Matlab software into numerical values. According to these values, the feed rate and speeds are decreased or increased at a certain rate until the optimum sound is obtained. The cutting process is then carried out with the optimum cutting parameters. During chip removal, the features of the cutting tool, the kind of material being cut, the cutting parameters, and the machine used all affect various process variables. Instead of measuring the parameters that emerge during cutting, such as temperature, vibration, and tool wear, detailed analysis of the sound emitted during the cutting process provides a much easier and more economical way of detecting the various data involved in the cutting process. The relation between cutting parameters and sound is thereby identified.

Keywords: cutting process, sound processing, intelligent lathe, sound analysis

Procedia PDF Downloads 329
5137 A 2-D and 3-D Embroidered Textrode Testing Framework Adhering to ISO Standards

Authors: Komal K., Cleary F., Wells J S.G., Bennett L

Abstract:

Smart fabric garments enable various monitoring applications across sectors such as healthcare, sports and fitness, and the military. Healthcare smart garments monitoring EEG, EMG, and ECG rely on the use of electrodes (dry or wet). However, such electrodes, when used for long-term monitoring, can cause discomfort and skin irritation for the wearer because of their inflexible structure and weight. Ongoing research has been investigating textile-based electrodes (textrodes) in order to provide more comfortable and usable fabric-based electrodes capable of providing intuitive biopotential monitoring. Progress has been made in this space, but textrodes still face a critical design challenge in maintaining consistent skin contact, which directly impacts signal quality. Furthermore, there is a lack of an ISO-based testing framework to validate the electrode design and assess its ability to achieve enhanced performance, strength, usability, and durability. This study proposes the development and evaluation of an ISO-compliant testing framework for standard 2D and advanced 3D embroidered textrode designs that have a unique structure intended to establish enhanced skin contact for the wearer. The testing framework leverages the ISO standards ISO 13934-1:2013 for tensile and zone-wise strength tests, ISO 13937-2 for tear tests, and ISO 6330 for washing, validating the textrode's performance, a necessity for wearable health parameter monitoring applications. Five textrodes (C1-C5) were designed using EPC win digitization software. Varying patterns such as running stitches, lock stitches, back-to-back stitches, and moss stitches were used to create the embroidered textrode samples using Madeira HC12 conductive thread with a resistivity of 100 ohm/m. The textrode designs were then fabricated using a ZSK technical embroidery machine. A comparative analysis was conducted based on a series of laboratory tests adhering to ISO compliance requirements. Tests focusing on the application of strain were applied to the textrodes; these included: (1) analysis of the electrode's overall surface area strength; (2) assessment of the robustness of the textrode boundaries; and (3) the assignment of fault test zones to each textrode, where vertical and horizontal slits of 3 mm were applied to evaluate the performance of the textrodes and their durability. Specific ISO-compliant washing tests were conducted multiple times on each textrode sample to assess both mechanical and chemical damage. Additionally, abrasion and pilling tests were performed to evaluate mechanical damage on the surface of the textrodes and to compare it with the washing test. Finally, the textrodes were assessed based on morphological and surface resistance changes. Results demonstrate that textrode C4, featuring a 3D layered structure consisting of foam, fabric, and conductive thread layers, significantly enhances skin-electrode contact for biopotential recording. The inclusion of a 3D foam layer was particularly effective in maintaining the shape of the electrode during strain tests, making it the top-performing textrode sample. Therefore, the layered 3D design of textrode C4 ranks highest when tested for durability, reusability, and washability. The ISO testing framework established in this study will support future research, validating the durability and reliability of textrodes for a wide range of applications.

Keywords: smart fabric, textrodes, testing framework, ISO compliant

Procedia PDF Downloads 73
5136 Optimized Preprocessing for Accurate and Efficient Bioassay Prediction with Machine Learning Algorithms

Authors: Jeff Clarine, Chang-Shyh Peng, Daisy Sang

Abstract:

Bioassay is the measurement of the potency of a chemical substance by its effect on a living animal or plant tissue. Bioassay data and chemical structures from pharmacokinetic and drug metabolism screening are mined from, and housed in, multiple databases. Bioassay predictions are then calculated to determine further advancement. This paper proposes a four-step preprocessing of datasets for improving bioassay predictions. The first step is instance selection, in which the dataset is divided into training, testing, and validation sets. The second step is discretization, which partitions the data in consideration of accuracy vs. precision. The third step is normalization, where data are normalized between 0 and 1 for subsequent machine learning processing. The fourth step is feature selection, where key chemical properties and attributes are generated. The streamlined results are then analyzed for the prediction of effectiveness by various machine learning tools, including Pipeline Pilot, R, Weka, and Excel. Experiments and evaluations reveal the effectiveness of various combinations of preprocessing steps and machine learning algorithms in producing more consistent and accurate predictions.
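
A minimal sketch of the four preprocessing steps, applied to a synthetic stand-in for a bioassay dataset (the descriptors, labels, and parameter choices are assumptions), is shown below:

```python
# Minimal sketch of the four-step preprocessing: (1) instance selection via
# train/test/validation split, (2) discretization, (3) normalization to [0, 1],
# and (4) feature selection. All data are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import KBinsDiscretizer, MinMaxScaler
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 40))                    # hypothetical chemical descriptors
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 1, 600) > 0).astype(int)  # active / inactive

# Step 1: instance selection (train / test / validation)
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_test, X_val, y_test, y_val = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# Step 2: discretization (trading precision for robustness)
disc = KBinsDiscretizer(n_bins=8, encode="ordinal", strategy="quantile").fit(X_train)

# Step 3: normalization of the discretized values to [0, 1]
scaler = MinMaxScaler().fit(disc.transform(X_train))

# Step 4: feature selection keeping the most informative descriptors
selector = SelectKBest(f_classif, k=10).fit(scaler.transform(disc.transform(X_train)), y_train)

X_train_ready = selector.transform(scaler.transform(disc.transform(X_train)))
print("preprocessed training matrix:", X_train_ready.shape)
```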

Keywords: bioassay, machine learning, preprocessing, virtual screen

Procedia PDF Downloads 268
5135 The Accuracy of Parkinson's Disease Diagnosis Using [123I]-FP-CIT Brain SPECT Data with Machine Learning Techniques: A Survey

Authors: Lavanya Madhuri Bollipo, K. V. Kadambari

Abstract:

Objectives: to discuss key issues in the diagnosis of Parkinson's disease (PD), features influencing PD progression, the importance of brain SPECT data in PD diagnosis, and the essential role of machine learning techniques in the early diagnosis of PD. An accurate and early diagnosis of PD is nowadays a challenge, as clinical symptoms in PD arise only when there is more than 60% loss of dopaminergic neurons. So far there are no laboratory tests for the diagnosis of PD, causing a high rate of misdiagnosis, especially when the disease is in its early stages. Recent neuroimaging studies with brain SPECT using 123I-Ioflupane (DaTSCAN) as the radiotracer have shown it to be widely used to assist the diagnosis of PD even in its early stages. Machine learning techniques can be used in combination with image analysis procedures to develop computer-aided diagnosis (CAD) systems for PD. This paper addresses recent studies involving the diagnosis of PD in its early stages using brain SPECT data with machine learning techniques.

Keywords: Parkinson disease (PD), dopamine transporter, single-photon emission computed tomography (SPECT), support vector machine (SVM)

Procedia PDF Downloads 389
5134 Effects of Different Load on Physiological, Hematological, Biochemical, Cytokines Indices of Zanskar Ponies at High Altitude

Authors: Prince Vivek, Vijay Kumar Bharti, Deepak Kumar, Rohit Kumar, Kapil Nehra, Dhananjay Singh, Om Prakash Chaurasia, Bhuvnesh Kumar

Abstract:

High-altitude native people still rely heavily on animal transport for logistic support in the eastern and northern Himalayan regions. The prevalent mountainous, rugged terrain is not suitable for motorized vehicles used in logistic transport. Therefore, people require high-endurance pack animals for load carrying and riding. So far, to the best of our knowledge, no studies have been undertaken to evaluate the effect of loads on the physiology of ponies in high-altitude regions. In this view, we evaluated variation in the physiological, hematological, biochemical, and cytokine indices of Zanskar ponies during load carrying at high altitude. A total of twelve (12) Zanskar pony mares, aged 4-6 years, were selected for this study. Feed was offered at 2% of body weight, and water ad libitum. The ponies were divided into three groups: group A (without load), group B (60 kg), and group C (80 kg) of backpack loads. The track was very narrow and slippery with gravel, uneven with a rocky surface, and had a steep gradient over 4 km uphill at altitudes of 3291 to 3500 m. On evaluating these parameters, the heart rate, pulse rate, and respiration rate were significantly increased in the 80 kg group among the three groups. Among the hematology parameters, hemoglobin was significantly increased in the 80 kg group on the 1st day after load carrying, followed by the control and 60 kg groups, whereas the PCV, lymphocyte, and monocyte percentages were significantly increased, and the ESR and eosinophil percentage significantly decreased, in the 80 kg group on the 7th day after load carrying, followed by the control and 60 kg groups. Among the biochemical parameters, LA, LDH, TP, hexokinase (HK), cortisol (CORT), T3, GPx, FRAP, and IL-6 were significantly increased in the 80 kg group on the 7th day after load carrying, followed by the control and 60 kg groups. ALT, ALB, GLB, UR, and UA were significantly increased in the 80 kg group on the 7th day before and after load carrying, followed by the control and 60 kg groups. CRT, AST, and CK-MB were significantly increased in the 80 kg group on the 1st and 7th days after load carrying, followed by the control and 60 kg groups. It is concluded that heart rate, respiration rate, hematological indices such as PCV, lymphocytes, monocytes, Hb, and ESR, and biochemical indices such as lactic acid, LDH, TP, HK, CORT, T3, ALT, AST, CRT, ALB, GLB, UR, UA, GPx, FRAP, and IL-6 are important biomarkers for assessing the effect of load on animal physiology and endurance. Further, the results reveal a strong correlation between changes in biomarker levels and performance in ponies during load carrying. Hence, these parameters might be used to assess the endurance performance of Zanskar ponies in the high-mountain region.

Keywords: biochemical, endurance, high altitude, load, ponies

Procedia PDF Downloads 274
5133 Numerical Study of Modulus of Subgrade Reaction in Eccentrically Loaded Circular Footing Resting on Sand

Authors: Seyed Abolhasan Naeini, Mohammad Hossein Zade

Abstract:

This article presents a numerical study of the behaviour of an eccentrically loaded circular footing resting on sand to determine its ultimate bearing capacity. A surface circular footing of diameter 12 cm (D) was used as the shallow foundation. For this purpose, three-dimensional models consisting of the foundation and a medium sandy soil were built in ABAQUS software. The bearing capacity of the footing was evaluated, and the effects of the load eccentricity on the bearing capacity, settlement, and modulus of subgrade reaction were studied. Three equally spaced values of load eccentricity, located inside the core, on the core boundary, and outside the core boundary, respectively e = 0.75, 1.5, and 2.25 cm, were considered. The results show that, with increasing load eccentricity, the ultimate load and the modulus of subgrade reaction decrease.

Keywords: circular foundation, sand, eccentric loading, modulus of subgrade reaction

Procedia PDF Downloads 341
5132 Reliability Based Analysis of Multi-Lane Reinforced Concrete Slab Bridges

Authors: Ali Mahmoud, Shadi Najjar, Mounir Mabsout, Kassim Tarhini

Abstract:

Empirical expressions for estimating the wheel load distribution and live-load bending moment are typically specified in highway bridge codes such as the AASHTO procedures. The purpose of this paper is to analyze the reliability levels that are inherent in reinforced concrete slab bridges designed on the basis of the simplified empirical live load equations in the AASHTO LRFD procedures. To achieve this objective, bridges with multiple lanes (three and four lanes) and different spans are modeled using finite-element analysis (FEA) subjected to HS20 truck loading, tandem loading, and standard lane loading per the AASHTO LRFD procedures. The FEA results are compared with the AASHTO LRFD moments in order to quantify the biases that might result from the simplifying assumptions adopted in AASHTO. A reliability analysis is conducted to quantify the reliability index for bridges designed using the AASHTO procedures. To reach a consistent level of safety for three- and four-lane bridges, following a previous study restricted to one- and two-lane bridges, the live load factor in the design equation proposed by AASHTO LRFD will be assessed and revised if needed by adjusting the live load factor for these lanes. The results will provide structural engineers with more consistent provisions to design concrete slab bridges or evaluate the load-carrying capacity of existing bridges.
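
For readers unfamiliar with the reliability index, the sketch below shows the textbook first-order calculation for normally distributed resistance and load effect; the moment statistics are illustrative, not results from this paper:

```python
# Minimal sketch: reliability index for normal resistance R and load effect S,
# beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2). Numbers are hypothetical.
import math

mu_R, sigma_R = 950.0, 95.0      # flexural resistance, kN*m (hypothetical mean, std)
mu_S, sigma_S = 600.0, 90.0      # combined load-effect moment, kN*m (hypothetical)

beta = (mu_R - mu_S) / math.sqrt(sigma_R**2 + sigma_S**2)
p_failure = 0.5 * math.erfc(beta / math.sqrt(2))   # Phi(-beta)
print(f"reliability index beta = {beta:.2f}, failure probability ~ {p_failure:.1e}")
```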

Keywords: reliability analysis of concrete bridges, finite element modeling, reliability analysis, reinforced concrete bridge design, load carrying capacity

Procedia PDF Downloads 333
5131 Fundamental Research Dissension between Hot and Cold Chamber High Pressure Die Casting

Authors: Sahil Kumar, Surinder Pal, Rahul Kapoor

Abstract:

This paper focuses on defining the basic difference between the hot and cold chamber high pressure die casting processes, which is not fully defined in the research papers we have studied. Pressure die casting is basically divided into two types: (1) hot chamber die casting and (2) cold chamber die casting. Cold chamber die casting is used for casting alloys that require high pressure and have a high melting temperature, such as brass, aluminum, magnesium, copper-based alloys, and other high-melting-point nonferrous alloys. Hot chamber die casting is suitable for casting zinc, tin, lead, and other low-melting-point alloys. In a hot chamber die casting machine, the molten metal is an integral part of the machine. It mainly consists of a hot chamber and a gooseneck-type metal container made of cast iron. This machine is mainly used for low-melting alloys and alloys of metals such as zinc and lead. Metals and alloys having a high melting point, and those having an affinity for iron, cannot be cast in this machine, as they would otherwise attack the shot sleeve and damage the machine.

Keywords: hot chamber die casting, cold chamber die casting, metals and alloys, casting technology

Procedia PDF Downloads 613
5130 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform

Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry

Abstract:

The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed and multi-owner resources to solve large-scale problems in science, engineering, and commerce. Recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of tremendous distributed resources, effective and efficient load balancing algorithms are fundamentally important. Unfortunately, load balancing algorithms in traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, cannot work well in the new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing, based on the R-tree and the domain-range entropy, is proposed to improve fault tolerance and load balancing by accounting for connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load balance and fault tolerance estimation. Fault tolerance is enhanced by adaptively decreasing replication time and message cost, while load balance is enhanced by adaptively decreasing mean job response time. Experimental results show that the proposed method yields superior performance over other methods.

Keywords: grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems

Procedia PDF Downloads 482