Search results for: discriminate accuracy
1038 A Comparative Study of Various Techniques Using WEKA for Red Blood Cell Classification
Authors: Jameela Ali, Hamid A. Jalab, Loay E. George, Abdul Rahim Ahmad, Azizah Suliman, Karim Al-Jashamy
Abstract:
Red blood cells (RBCs) are the most common type of blood cell and are the most intensively studied in cell biology. A lack of RBCs is a condition in which the hemoglobin level is lower than normal and is referred to as “anemia”. Abnormalities in RBCs will affect the exchange of oxygen. This paper presents a comparative study of various techniques for classifying red blood cells as normal or abnormal (anemic) using WEKA, an open-source suite of machine learning algorithms for data mining applications. The algorithms tested are the Radial Basis Function neural network, the Support Vector Machine, and the K-Nearest Neighbors algorithm. Two sets of combined features were utilized for the classification of blood cell images. The first set, consisting exclusively of geometrical features, was used to identify whether a tested blood cell is spherical or non-spherical. The second set, consisting mainly of textural features, was used to recognize the type of spherical cell. We provide an evaluation based on applying these classification methods to our RBC image dataset, obtained from Serdang Hospital, Malaysia, and measuring the accuracy of the test results. The best achieved classification rates are 97%, 98%, and 79% for the Support Vector Machine, Radial Basis Function neural network, and K-Nearest Neighbors algorithm, respectively.
Keywords: red blood cells, classification, radial basis function neural networks, support vector machine, k-nearest neighbors algorithm
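For readers who want to reproduce the two-stage idea outside WEKA, the following is a minimal Python sketch using scikit-learn; the feature arrays and labels are invented placeholders, and an RBF-kernel SVM merely stands in for the RBF neural network used in the paper.

```python
# Sketch of the two-stage RBC classification described above, re-expressed
# with scikit-learn instead of WEKA. Features and labels are placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
geom = rng.normal(size=(200, 4))            # stage 1: geometrical features
tex = rng.normal(size=(200, 8))             # stage 2: textural features
is_spherical = rng.integers(0, 2, 200)      # stage-1 labels
cell_type = rng.integers(0, 3, 200)         # stage-2 labels (spherical subtypes)

# Stage 1: spherical vs non-spherical from geometry alone.
Xtr, Xte, ytr, yte = train_test_split(geom, is_spherical, random_state=0)
stage1 = SVC(kernel="rbf").fit(Xtr, ytr)    # RBF kernel stands in for an RBF network
print("stage 1 accuracy:", stage1.score(Xte, yte))

# Stage 2: subtype of the spherical cells from texture.
mask = is_spherical == 1
Xtr, Xte, ytr, yte = train_test_split(tex[mask], cell_type[mask], random_state=0)
stage2 = KNeighborsClassifier(n_neighbors=5).fit(Xtr, ytr)
print("stage 2 accuracy:", stage2.score(Xte, yte))
```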
Procedia PDF Downloads 481
1037 Classification for Obstructive Sleep Apnea Syndrome Based on Random Forest
Authors: Cheng-Yu Tsai, Wen-Te Liu, Shin-Mei Hsu, Yin-Tzu Lin, Chi Wu
Abstract:
Background: Obstructive sleep apnea syndrome (OSAS) is a common respiratory disorder during sleep, and body parameters have been identified as highly predictive of OSAS severity. However, the effects of body parameters on OSAS severity remain unclear. Objective: The objective of this study is to establish a prediction model for OSAS using body parameters and to investigate their effects on OSAS. Methodology: Severity was quantified by polysomnography as the mean hourly number of dips in oxygen saturation greater than 3% during examination at a hospital in New Taipei City (Taiwan). Four levels of OSAS severity were classified by the apnea and hypopnea index (AHI) according to the American Academy of Sleep Medicine (AASM) guideline. Body parameters, including neck circumference, waist size, and body mass index (BMI), were obtained from questionnaires. The recruited subjects were then divided into two groups: a training group used to build the random forest (RF) predictor, and a testing group used to evaluate classification accuracy. Results: 3,330 subjects were recruited in this study, all of whom had undergone polysomnography to evaluate OSAS severity. An RF of 1,000 trees correctly classified 79.94% of test cases. When further evaluated on the test cohort, the RF identified waist size and BMI as the most important factors in OSAS. Conclusion: Body parameters can provide a prescreening tool for patients to pre-evaluate their health risks.
Keywords: apnea and hypopnea index, body parameters, obstructive sleep apnea syndrome, random forest
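A minimal sketch of the classification stage, assuming a hypothetical table of body parameters; the column names, values, and split are illustrative, not the study's data.

```python
# Sketch of the random-forest severity classifier described above;
# the data frame is an invented placeholder.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.DataFrame({
    "neck_cm": [38, 41, 36, 44], "waist_cm": [90, 104, 82, 118],
    "bmi": [24.1, 29.5, 21.8, 33.0], "severity": [0, 2, 0, 3],  # 4 AHI classes
})
X, y = df[["neck_cm", "waist_cm", "bmi"]], df["severity"]
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.5, random_state=1)

rf = RandomForestClassifier(n_estimators=1000, random_state=1).fit(Xtr, ytr)
print("test accuracy:", rf.score(Xte, yte))
# Importance ranking; the study reports waist size and BMI as the top factors.
print(dict(zip(X.columns, rf.feature_importances_)))
```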
Procedia PDF Downloads 156
1036 Designing a Tooling Solution for Material Handling in a Highly Automated Manufacturing System
Authors: Muhammad Umair, Yuri Nikolaev, Denis Artemov, Ighor Uzhinsky
Abstract:
A flexible manufacturing system is an integral part of the smart factory of Industry 4.0, in which every machine is interconnected and works autonomously. Robots are in the process of replacing humans in every industrial sector. As cyber-physical systems (CPS) and artificial intelligence (AI) advance, the manufacturing industry is becoming more dependent on computers than on human brains. This modernization has boosted production quality and accuracy and shifted industry from classic production to smart manufacturing systems. However, material handling for such automated production is a challenge and needs to be addressed with the best possible solution. Conventional clamping systems are designed for manual work and are not suitable for highly automated production systems. Researchers and engineers are trying to find the most economical solution for loading/unloading and transporting workpieces from a warehouse to a machine shop for machining operations and back to the warehouse without human involvement. This work aims to propose an advanced multi-shape tooling solution for highly automated manufacturing systems. Current results show that it could function well with automated guided vehicles (AGVs) and modern conveyor belts. The proposed solution meets the requirements of being automation-friendly and universal across different part geometries and production operations. We used a bottom-up approach in this work, starting with studying different case scenarios and their limitations and finishing with the general solution.
Keywords: artificial intelligence, cyber-physical systems, Industry 4.0, material handling, smart factory, flexible manufacturing system
Procedia PDF Downloads 135
1035 A Criterion to Minimize FE Mesh-Dependency in a Concrete Plate Subjected to Impact Loading
Authors: Hyo-Gyung Kwak, Han Gul Gang
Abstract:
In the context of an increasing need for reliability and safety in concrete structures under blast and impact loading conditions, the behavior of concrete at high strain rates has been an important issue. Since concrete subjected to impact loading associated with high strain rates shows quite different material behavior from that in the static state, several material models have been proposed and used to describe this behavior under blast and impact loading. In the modelling process, mesh dependency of the finite element (FE) discretization is the key problem, because simulation results under high strain-rate conditions are quite sensitive to the applied FE mesh size; the accuracy of simulation results may therefore depend strongly on the mesh. This paper introduces an improved criterion which can minimize the mesh-dependency of simulation results on the basis of the fracture energy concept, and the HJC (Holmquist Johnson Cook), CSC (Continuous Surface Cap), and K&C (Karagozian & Case) models are examined to trace their relative sensitivity to the FE mesh size used. In line with the purpose of the penetration test of a concrete plate struck by a projectile (bullet), the residual velocities of the projectile after penetration are compared. The correlation studies between analytical results and the associated parametric studies show that the variation of residual velocity with FE mesh size is greatly reduced by applying a unique failure strain value determined according to the proposed criterion.
Keywords: high strain rate concrete, penetration simulation, failure strain, mesh-dependency, fracture energy
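The abstract does not state the criterion in closed form; as a rough illustration of the fracture-energy concept it builds on, the sketch below applies a crack-band-style scaling (an assumption, not the authors' exact criterion) that keeps the fracture energy dissipated per element constant as the mesh size changes.

```python
# Hypothetical crack-band-style scaling: choose the failure strain so that
# the fracture energy dissipated per element, G_f ~ 0.5 * f_t * eps_f * h,
# stays constant as the element size h changes. Parameter values are assumed.
G_f = 120.0   # fracture energy [N/m], assumed
f_t = 3.5e6   # tensile strength [Pa], assumed

for h in (0.005, 0.010, 0.020):                 # element sizes [m]
    eps_f = 2.0 * G_f / (f_t * h)               # unique failure strain per mesh
    print(f"h = {h * 1000:4.0f} mm -> eps_f = {eps_f:.4e}")
```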
Procedia PDF Downloads 525
1034 The Influence of Air Temperature Controls on the Estimation of Air Temperature over Homogeneous Terrain
Authors: Fariza Yunus, Jasmee Jaafar, Zamalia Mahmud, Nurul Nisa’ Khairul Azmi, Nursalleh K. Chang
Abstract:
Variation of air temperature from one place to another is caused by air temperature controls. In general, the most important control of air temperature is elevation. Another significant independent variable in estimating air temperature is the location of meteorological stations. Distance to the coastline and land use type also contribute to significant variations in air temperature. On the other hand, over homogeneous terrain, direct interpolation of discrete air temperature points works well for estimating air temperature in un-sampled areas, since the estimation is based solely on the discrete points. However, this study shows that air temperature controls also play significant roles in estimating air temperature over the homogeneous terrain of Peninsular Malaysia. An Inverse Distance Weighting (IDW) interpolation technique was adopted to generate continuous air temperature data. This study compared two different datasets: observed mean monthly values of T, and the estimation error T–T’, where T’ is the value estimated by a multiple regression model. The multiple regression model considered eight independent variables (elevation, latitude, longitude, distance to coastline, and four land use types: water bodies, forest, agriculture, and built-up areas) to represent the role of air temperature controls. Cross-validation analysis was conducted to assess the accuracy of the estimates. Final results show that estimates based on T–T’ produced lower errors for mean monthly air temperature over homogeneous terrain in Peninsular Malaysia.
Keywords: air temperature control, interpolation analysis, Peninsular Malaysia, regression model, air temperature
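A minimal sketch of the residual-interpolation idea (regression estimate plus IDW-interpolated residuals); station coordinates, temperatures, and the power parameter are invented for illustration.

```python
# Regress T on the controls, interpolate the residuals T - T' with IDW,
# then add T' back at the target site. Station data below are invented.
import numpy as np

stations = np.array([[101.0, 3.1], [101.5, 3.4], [102.2, 2.9]])  # lon, lat
T = np.array([27.8, 26.9, 27.4])          # observed mean monthly temperature
T_hat = np.array([27.5, 27.2, 27.1])      # regression estimates at stations
resid = T - T_hat

def idw(points, values, target, power=2.0):
    d = np.linalg.norm(points - target, axis=1)
    if np.any(d == 0):                    # exact hit: return the station value
        return values[np.argmin(d)]
    w = 1.0 / d**power
    return np.sum(w * values) / np.sum(w)

target = np.array([101.8, 3.2])
T_hat_target = 27.3                       # regression estimate at the target
print("estimate:", T_hat_target + idw(stations, resid, target))
```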
Procedia PDF Downloads 377
1033 A Low-Order Thermal Envelope Model for Heat Transfer Characteristics of Low-Rise Residential Buildings
Authors: Nadish Anand, Richard D. Gould
Abstract:
A simple model is introduced for determining the thermal characteristics of a low-rise residential (LRR) building and predicting the energy usage of its Heating, Ventilation & Air Conditioning (HVAC) system in response to changes in weather conditions, which are reflected in the ambient (outside air) temperature. The LRR building is treated as a single lump for solving the heat transfer problem, and the model is derived using the lumped capacitance model of transient conduction heat transfer. Most contemporary HVAC systems have a thermostat control with an offset temperature and user-defined set-point temperatures that define when the HVAC system switches on and off. The aim is to accurately predict the body temperature (i.e., the inside air temperature), from which the switching of the HVAC system can be estimated. To validate the mathematical model derived from lumped capacitance, we used the EnergyPlus simulation engine, which simulates buildings with considerable accuracy. Using the low-order model, we predicted the inside air temperature of a single house placed in three different climate zones (Detroit, Raleigh, and Austin) with different orientations for the summer and winter seasons. The prediction error of the model, for the same day as that of the model parameter calculation, was below 10% in winter for almost all orientations and climate zones, whereas in summer the error was below 10% for all orientations only in the higher-latitude climate zones (Raleigh and Detroit). Possible factors responsible for the large variations are also noted in the work, paving the way for future research.
Keywords: building energy, energy consumption, EnergyPlus, HVAC, low order model, lumped capacitance
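A minimal sketch of the lumped-capacitance balance with a thermostat deadband; all parameter values are assumptions, not the identified model parameters.

```python
# Lumped-capacitance sketch: C * dT/dt = -UA * (T - T_amb) + Q_hvac,
# integrated by forward Euler with an on/off thermostat deadband.
C = 8.0e6     # lumped thermal capacitance [J/K], assumed
UA = 250.0    # envelope conductance [W/K], assumed
Q = 5000.0    # HVAC heating power when on [W], assumed
set_point, offset = 20.0, 1.0              # on below 19 C, off above 21 C
T, hvac_on, dt = 18.0, False, 60.0         # inside temp [C], state, step [s]

for step in range(24 * 60):                # one day, 1-minute steps
    T_amb = 5.0                            # constant ambient for illustration
    if T < set_point - offset:
        hvac_on = True
    elif T > set_point + offset:
        hvac_on = False
    dTdt = (-UA * (T - T_amb) + (Q if hvac_on else 0.0)) / C
    T += dTdt * dt                         # forward-Euler update
print(f"inside air temperature after 24 h: {T:.2f} C")
```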
Procedia PDF Downloads 270
1032 High Aspect Ratio Micropillar Array Based Microfluidic Viscometer
Authors: Ahmet Erten, Adil Mustafa, Ayşenur Eser, Özlem Yalçın
Abstract:
We present a new viscometer based on a microfluidic chip with elastic high-aspect-ratio micropillar arrays. The displacement of the pillar tips in the flow direction can be used to analyze the viscosity of a liquid. In our work, Computational Fluid Dynamics (CFD) is used to analyze the pillar displacement of various micropillar array configurations in the flow direction at different viscosities. Following CFD optimization, micro-CNC-based rapid prototyping is used to fabricate molds for the microfluidic chips. The chips are fabricated out of polydimethylsiloxane (PDMS) using soft lithography methods with molds machined out of aluminum. Tip displacements of the micropillar array (pillars 300 µm in diameter and 1400 µm in height) in the flow direction are recorded using a microscope-mounted camera, and the displacements are analyzed using image processing with an algorithm written in MATLAB. Experiments are performed with water-glycerol solutions mixed at four different ratios to attain viscosities of 1 cP, 5 cP, 10 cP, and 15 cP at room temperature. The prepared solutions are injected into the microfluidic chips using a syringe pump at flow rates from 10 to 100 mL/hr, and the displacement versus flow rate is plotted for different viscosities. A displacement of around 1.5 µm was observed for the 15 cP solution at 60 mL/hr, while only a 1 µm displacement was observed for the 10 cP solution. The presented viscometer design is still being optimized for better sensitivity and accuracy. Our microfluidic viscometer platform has potential for tailor-made microfluidic chips enabling real-time observation and control of viscosity changes in biological or chemical reactions.
Keywords: Computational Fluid Dynamics (CFD), high aspect ratio, micropillar array, viscometer
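A sketch of the measurement principle under the assumption that tip displacement scales linearly with the product of viscosity and flow rate; the calibration numbers are invented.

```python
# Calibrate a displacement ~ k * (viscosity * flow rate) relation on known
# fluids, then invert it for an unknown sample. This linear scaling is an
# assumption for illustration, not the authors' calibration.
import numpy as np

mu = np.array([1.0, 5.0, 10.0, 15.0])        # calibration viscosities [cP]
Q = 60.0                                     # flow rate [mL/hr]
disp = np.array([0.10, 0.50, 1.00, 1.50])    # tip displacements [um], invented

k = np.polyfit(mu * Q, disp, 1)[0]           # slope of disp vs (mu * Q)
d_sample = 0.80                              # measured displacement [um]
print("estimated viscosity [cP]:", d_sample / (k * Q))
```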
Procedia PDF Downloads 248
1031 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection
Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew
Abstract:
Detecting email spam is a very important task in the era of digital technology, one that needs effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable, and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique provides interpretable explanations for specific email classifications to help users understand the model's decision-making process. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework and allows the creation of simplified, interpretable models tailored to individual emails. LIME identifies influential terms, pointing out the key elements that drive classification results and thus reducing the opacity inherent in conventional machine learning models. Additionally, we suggest a visualization scheme for displaying keywords that improves users' understanding of categorization decisions. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian Naive Bayes, Multinomial Naive Bayes, Bernoulli Naive Bayes, Support Vector Classifier, K-Nearest Neighbors, Decision Tree, and Logistic Regression. Our testing results show that our model surpasses all other models, achieving an accuracy of 96.59% and a precision of 99.12%.
Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression
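A minimal sketch of the LIME pipeline, assuming a TF-IDF plus logistic-regression classifier and two placeholder emails; the real study uses a full labeled dataset and the additional baselines listed above.

```python
# Wrap a text classifier with LimeTextExplainer to surface the influential
# terms behind a spam/ham decision. Training emails are tiny placeholders.
from lime.lime_text import LimeTextExplainer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["win a free prize now", "meeting agenda for tomorrow"] * 10
labels = [1, 0] * 10                          # 1 = spam, 0 = ham
clf = make_pipeline(TfidfVectorizer(), LogisticRegression()).fit(texts, labels)

explainer = LimeTextExplainer(class_names=["ham", "spam"])
exp = explainer.explain_instance("claim your free prize",
                                 clf.predict_proba, num_features=3)
print(exp.as_list())                          # influential terms with weights
```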
Procedia PDF Downloads 51
1030 Unlocking Green Hydrogen Potential: A Machine Learning-Based Assessment
Authors: Said Alshukri, Mazhar Hussain Malik
Abstract:
Green hydrogen is hydrogen produced using renewable energy sources. In the last few years, Oman has aimed to reduce its dependency on fossil fuels. Recently, the hydrogen economy has become a global trend, and many countries have started to investigate the feasibility of implementing this sector. Oman created an alliance to establish the policy and rules for this sector. With motivation coming from both global and local interest in green hydrogen, this paper investigates the potential of producing hydrogen from wind and solar energy at three different locations in Oman, namely Duqm, Salalah, and Sohar. Using the machine learning-based software WEKA and local meteorological data, the project was designed to determine which location has the highest wind and solar energy potential. First, various supervised models were tested for prediction accuracy, and the Random Forest (RF) model was found to have the best prediction performance. The RF model was applied to 2021 meteorological data for each location, and the results indicated that Duqm has the highest wind and solar energy potential. A system of one wind turbine in Duqm can produce 8335 MWh/year, which could be utilized in the water electrolysis process to produce 88,847 kg of hydrogen, while a solar system consisting of 2820 solar cells is estimated to produce 1666.223 MWh/year, capable of producing 177,591 kg of hydrogen.
Keywords: green hydrogen, machine learning, wind and solar energies, WEKA, supervised models, random forest
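A back-of-the-envelope check of the electrolysis arithmetic; the specific electricity consumption of about 53 kWh per kg of hydrogen is a typical literature value assumed here, not a figure from the abstract.

```python
# Convert an annual energy yield into a hydrogen mass under an assumed
# specific electricity consumption for water electrolysis.
wind_energy_mwh = 8335.0            # Duqm wind yield from the abstract
assumed_kwh_per_kg = 53.0           # assumed, not from the abstract
print("H2 from wind [kg]:", wind_energy_mwh * 1000 / assumed_kwh_per_kg)

# Specific energy implied by the abstract's own wind and solar figures:
print("implied kWh/kg (wind):", 8335.0 * 1000 / 88847)
print("implied kWh/kg (solar):", 1666.223 * 1000 / 177591)
```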
Procedia PDF Downloads 80
1029 Investigating the Effectiveness of a 3D Printed Composite Mold
Authors: Peng Hao Wang, Garam Kim, Ronald Sterkenburg
Abstract:
In composite manufacturing, the fabrication and maintenance of tooling contribute a large portion of the total cost. However, as the applications of composite materials continue to increase, there is also a growing demand for more tooling. This demand places heavy emphasis on the industry's ability to fabricate high-quality tools while maintaining cost-effectiveness. One popular tool-fabrication technique currently being developed utilizes the additive manufacturing technology known as 3D printing, whose popularity is due to its low material waste, low cost, and quick fabrication time. In this study, a team of Purdue University School of Aviation and Transportation Technology (SATT) faculty and students investigated the effectiveness of a 3D printed composite mold. A steel valve cover from an aircraft reciprocating engine was modeled utilizing 3D scanning and computer-aided design (CAD) to create a 3D printed composite mold. The mold was used to fabricate carbon fiber versions of the aircraft reciprocating engine valve cover. The carbon fiber valve covers were evaluated for dimensional accuracy and quality, while the 3D printed composite mold was evaluated for durability and dimensional stability. The data collected from this study provided valuable information on 3D printed composite molds, potential improvements for the molds, and considerations for future tooling design.
Keywords: additive manufacturing, carbon fiber, composite tooling, molds
Procedia PDF Downloads 115
1028 Thermal Hysteresis Activity of Ice Binding Proteins during Ice Crystal Growth in Sucrose Solution
Authors: Bercem Kiran-Yildirim, Volker Gaukel
Abstract:
Ice recrystallization (IR), which occurs especially during frozen storage, is an undesired process due to its possible influence on product quality. As a result of recrystallization, the total volume of ice remains constant, but the size, number, and shape of the ice crystals change. For instance, as indicated in the literature, the size of ice crystals in ice cream increases due to recrystallization, resulting in texture deterioration. The inhibition of ice recrystallization is therefore of great importance, not only for the food industry but also for several other areas where sensitive products are stored frozen, such as pharmaceutical products or organs and blood in medicine. Ice-binding proteins (IBPs) have the unique ability to inhibit ice growth and, in consequence, recrystallization. This effect is based on their ice-binding affinity. In the presence of IBPs in a solution, ice crystal growth is inhibited during temperature decrease until a certain temperature is reached, whereas melting during temperature increase is not influenced. The gap between the melting and freezing points is known as thermal hysteresis (TH). In the literature, TH activity is usually investigated under laboratory conditions in IBP buffer solutions; in product applications (e.g., food), many other solutes are present which may influence TH activity. In this study, a subset of IBPs, the so-called antifreeze proteins (AFPs), is used to investigate the influence of sucrose solution concentration on TH activity. For the investigation, a polarization microscope (Nikon Eclipse LV100ND) equipped with a digital camera (Nikon DS-Ri1) and a cold stage (Linkam LTS420) was used. In a first step, the equipment was set up and validated for the accuracy of TH measurements against literature data.
Keywords: ice binding proteins, ice crystals, sucrose solution, thermal hysteresis
Procedia PDF Downloads 188
1027 The Posthuman Condition and a Translational Ethics of Entanglement
Authors: Shabnam Naderi
Abstract:
Traditional understandings of ethics considered translators, translations, technologies, and other agents as separate, prioritized human agents; in fact, ethics was equated with morality. This disengaged understanding of ethics is superseded by an ethics of relation/entanglement in posthuman philosophy. According to this ethics of entanglement, human and nonhuman agents are in constant ‘intra-action’. The human is not separate from nature, from technology, or from other nonhuman entities, and an ethics of translation in this regard cannot be separated from technology and ecology or be defined merely within the realm of human-human encounter. As such, a posthuman ethics offers opportunities for change and responds to the changing nature of reality; it is negotiable and reveals itself as a moment-by-moment practice (i.e., as temporally emergent and beyond determinacy and permanence). Far from linguistic, cultural, or individual concerns, posthuman translational ethics discusses how formerly rigid norms and laws are challenged in a process ontology which emphasizes activity and activation and considers ethics as surfacing in activity, not as a predefined set of rules and values. In this sense, traditional ethical principles like faithfulness, accuracy, and representation are superseded by principles of privacy, sustainability, multiplicity, and decentralization. The present conceptual study, drawing on Ferrando's philosophical posthumanism (as a post-humanism, a post-dualism, and a post-anthropocentrism), Deleuze-Guattarian philosophy of immanence, and Barad's physics-philosophy, strives to destabilize traditional understandings of translation ethics and bring an ethics that has loose ends and revolves around multiplicity and decentralization into the picture.
Keywords: ethics of entanglement, post-anthropocentrism, post-dualism, post-humanism, translation
Procedia PDF Downloads 81
1026 Effect of Genuine Missing Data Imputation on Prediction of Urinary Incontinence
Authors: Suzan Arslanturk, Mohammad-Reza Siadat, Theophilus Ogunyemi, Ananias Diokno
Abstract:
Missing data is a common challenge in the statistical analysis of clinical survey datasets, and a variety of methods have been developed to deal with missing values, of which imputation is the most commonly used. However, in order to minimize the bias introduced by imputation, one must choose the right imputation technique and apply it to the correct type of missing data. In this paper, we identify different types of missing values: missing data due to skip pattern (SPMD), undetermined missing data (UMD), and genuine missing data (GMD), and apply rough set imputation to the GMD portion of the missing data only. We used rough set imputation to evaluate the effect of such imputation on prediction by generating several simulation datasets based on an existing epidemiological dataset (MESA). To measure how well each dataset lends itself to the prediction model (logistic regression), we used p-values from the Wald test. To evaluate the accuracy of the prediction, we considered the width of the 95% confidence interval for the probability of incontinence. Both imputed and non-imputed simulation datasets were fit to the prediction model, and both turned out to be significant (p-value < 0.05). However, the Wald score shows a better fit for the imputed datasets compared to the non-imputed ones (28.7 vs. 23.4). The average confidence interval width decreased by 10.4% when the imputed dataset was used, meaning higher precision. The results show that using the rough set method for missing data imputation on GMD improves the predictive capability of the logistic regression. Further studies are required to generalize this conclusion to other clinical survey datasets.
Keywords: rough set, imputation, clinical survey data simulation, genuine missing data, predictive index
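A minimal sketch of the evaluation metrics described above (Wald statistics and confidence-interval width) on simulated placeholder data; as a simple stand-in, the widths of the coefficient confidence intervals are reported, whereas the study measures the interval of the predicted probability of incontinence.

```python
# Fit a logistic regression, read the Wald statistics, and measure 95% CI
# widths. Data are simulated placeholders, not the MESA-based datasets.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
X = sm.add_constant(rng.normal(size=(300, 2)))
y = (rng.random(300) < 1 / (1 + np.exp(-(X @ [0.3, 1.2, -0.8])))).astype(int)

fit = sm.Logit(y, X).fit(disp=False)
print(fit.summary2())                      # z-values are the Wald statistics
ci = fit.conf_int(alpha=0.05)              # 95% CI of the coefficients
print("mean CI width:", np.mean(ci[:, 1] - ci[:, 0]))
```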
Procedia PDF Downloads 170
1025 HLB Disease Detection in Omani Lime Trees Using Hyperspectral Imaging Based Techniques
Authors: Jacintha Menezes, Ramalingam Dharmalingam, Palaiahnakote Shivakumara
Abstract:
In recent years, Omani acid lime cultivation and production have been affected by citrus greening, or Huanglongbing (HLB), disease. HLB is one of the most destructive diseases of citrus, with no remedies or countermeasures to stop it. The currently used polymerase chain reaction (PCR) and enzyme-linked immunosorbent assay (ELISA) detection tests require lengthy and labor-intensive laboratory procedures. Furthermore, the equipment and staff needed to carry out these procedures are frequently specialized, making them a less optimal solution for detecting the disease. The current research uses hyperspectral imaging technology for the automatic detection of citrus trees with HLB disease. Omani citrus tree leaf images were captured with a portable Specim IQ hyperspectral camera. The research considered healthy, nutrition-deficient, and HLB-infected leaf samples, classified on the basis of the PCR test. The high-resolution image samples were sliced into sub-cubes, which were further processed to obtain RGB images with spatial features. Similarly, RGB spectral slices were obtained through a moving window over the wavelength dimension. The resized spectral-spatial RGB images were given to convolutional neural networks for deep feature extraction. The current research was able to classify a given sample into the appropriate class with 92.86% accuracy, indicating the effectiveness of the proposed techniques. The significant bands differing among the three types of leaves were found to be 560 nm, 678 nm, 726 nm, and 750 nm.
Keywords: huanglongbing (HLB), hyperspectral imaging (HSI), Omani citrus, CNN
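A minimal sketch of the processing chain, slicing a hyperspectral cube into three-band spectral slices and feeding them to a small CNN; the cube dimensions, band count, and network are illustrative assumptions.

```python
# Slice a hyperspectral cube into 3-band (RGB-like) spectral slices via a
# moving window over the wavelength axis, then train a small CNN.
import numpy as np
from tensorflow import keras

cube = np.random.rand(64, 64, 204)              # hypothetical HSI sub-cube
window = [cube[:, :, i:i + 3] for i in range(0, 202, 3)]  # moving spectral window
x = np.stack(window)                            # (n_slices, 64, 64, 3)
y = keras.utils.to_categorical(np.random.randint(0, 3, len(x)), 3)

model = keras.Sequential([
    keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 3)),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(3, activation="softmax"),  # healthy / deficient / HLB
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=1, verbose=0)
```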
Procedia PDF Downloads 83
1024 Displacement Solution for a Static Vertical Rigid Movement of an Interior Circular Disc in a Transversely Isotropic Tri-Material Full-Space
Authors: D. Mehdizadeh, M. Rahimian, M. Eskandari-Ghadi
Abstract:
This article is concerned with the determination of the static interaction of a vertically loaded rigid circular disc embedded at the interface of a horizontal layer sandwiched between two different transversely isotropic half-spaces, together called a tri-material full-space. The axes of symmetry of the different regions are assumed to be normal to the horizontal interfaces and parallel to the movement direction. With the use of a potential function method, and by applying Hankel integral transforms in the radial direction, the governing partial differential equation for the single scalar potential function is transformed into a fourth-order ordinary differential equation, and the mixed boundary conditions are transformed into a pair of integral equations, called dual integral equations, which can be reduced to a Fredholm integral equation of the second kind that is solved analytically. The displacements and stresses are then given in the form of improper line integrals resulting from the inverse Hankel transforms. It is shown that the present solutions are in exact agreement with the existing solutions for a homogeneous full-space of transversely isotropic material. To confirm the accuracy of the numerical evaluation of the integrals involved, the numerical results are compared with the existing solutions for the homogeneous full-space. Several cases with different degrees of material anisotropy are then compared to portray the effect of the degree of anisotropy.
Keywords: transversely isotropic, rigid disc, elasticity, dual integral equations, tri-material full-space
Procedia PDF Downloads 442
1023 A Low-Power Two-Stage Seismic Sensor Scheme for Earthquake Early Warning System
Authors: Arvind Srivastav, Tarun Kanti Bhattacharyya
Abstract:
The north-eastern, Himalayan, and Eastern Ghats belts of India comprise earthquake-prone, remote, and hilly terrains, where earthquakes have caused enormous damage in the past. A wireless sensor network based earthquake early warning system (EEWS) is being developed to mitigate such damage. It consists of sensor nodes, distributed over the region, that perform majority voting on the outputs of the seismic sensors in the vicinity and relay a message to a base station to alert residents when an earthquake is detected. At the heart of the EEWS is a low-power two-stage seismic sensor that continuously tracks seismic events from the incoming three-axis accelerometer signal at the first stage and, in the presence of a seismic event, triggers the second-stage P-wave detector, which detects the onset of the P-wave in an earthquake event. The parameters of the P-wave detector have been optimized to minimize detection time and maximize detection accuracy. The working of the sensor scheme has been verified with data from seven earthquakes retrieved from IRIS; in all test cases, the scheme detected the onset of the P-wave accurately. It has also been established that the P-wave onset detection time reduces linearly with the sampling rate: verified with test data, the detection time for data sampled at 10 Hz was around 2 seconds, which reduced to 0.3 seconds for data sampled at 100 Hz.
Keywords: earthquake early warning system, EEWS, STA/LTA, polarization, wavelet, event detector, P-wave detector
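A minimal windowed STA/LTA onset picker of the kind commonly used for P-wave detection; the window lengths and threshold are assumptions, not the paper's optimized parameters.

```python
# Windowed STA/LTA: trigger when the short-term average energy exceeds the
# long-term average by a threshold ratio. Parameters are assumed.
import numpy as np

def sta_lta_onset(x, fs, sta_s=0.5, lta_s=5.0, thresh=4.0):
    e = x**2                                   # signal energy
    nsta, nlta = int(sta_s * fs), int(lta_s * fs)
    for n in range(nlta, len(e)):
        sta = e[n - nsta:n].mean()
        lta = e[n - nlta:n].mean()
        if lta > 0 and sta / lta > thresh:
            return n / fs                      # onset time [s]
    return None

fs = 100.0                                     # sampling rate [Hz]
t = np.arange(0, 20, 1 / fs)
x = np.random.randn(t.size) * 0.1              # background noise...
x[int(8 * fs):] += np.sin(2 * np.pi * 5 * t[int(8 * fs):])  # ...then an arrival
print("detected onset near t =", sta_lta_onset(x, fs), "s")
```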
Procedia PDF Downloads 180
1022 Quantification Model for Capability Evaluation of Optical-Based In-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process
Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum
Abstract:
Due to the increasing demand for quality assurance and reliability in additive manufacturing, the development of an advanced in-situ monitoring system is required to monitor process anomalies as input for further process control. Optical-based monitoring systems, such as CMOS cameras and NIR cameras, have proved to be effective ways to monitor geometrical distortion and exceptional thermal distribution, and many studies and applications therefore focus on the availability of optical-based monitoring systems for detecting various types of defects. However, the capability of the monitoring setup itself is usually not quantified. In this study, a quantification model for evaluating the capability of monitoring setups for the LPBF machine, based on monitoring data acquired from a designed test artifact, is presented, and the design of the relevant test artifacts is discussed. The monitoring setup is evaluated based on its hardware properties, integration location, and light conditions. The data-processing methodology used to quantify the capability for each aspect is discussed. The minimal detectable feature size of the monitoring setup in the application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model quantifies the monitoring system's performance, which makes it possible to evaluate monitoring systems with the same concept but different setups for the LPBF process, and provides direction for improving the setups.
Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantification model, test artifact
Procedia PDF Downloads 200
1021 Large Eddy Simulation of Hydrogen Deflagration in Open Space and Vented Enclosure
Authors: T. Nozu, K. Hibi, T. Nishiie
Abstract:
This paper discusses the applicability of a numerical model for predicting damage from an accidental hydrogen explosion occurring in a hydrogen facility. The numerical model is based on the unstructured finite volume method (FVM) code “NuFD/FrontFlowRed”. To simulate the unsteady turbulent combustion of leaked hydrogen gas, a combination of Large Eddy Simulation (LES) and a combustion model was used. The combustion model is based on a two-scalar flamelet approach, in which a G-equation model and a conserved scalar model express the propagation of the premixed flame surface and the diffusion combustion process, respectively. To validate this numerical model, we simulated two previous types of hydrogen explosion tests. One is an open-space explosion test, in which the source was a prismatic 5.27 m³ volume containing a 30% hydrogen-air mixture; a reinforced concrete wall was set 4 m away from the front surface of the source, which was ignited at the bottom center by a spark. The other is a vented-enclosure explosion test, in which the chamber was 4.6 m × 4.6 m × 3.0 m with a vent opening of 5.4 m² on one side; the test was performed with ignition at the center of the wall opposite the vent, using hydrogen-air mixtures with hydrogen concentrations close to 18 vol%. The results of the numerical simulations are compared with the previous experimental data to assess the accuracy of the numerical model, and we have verified that the simulated overpressures and flame time-of-arrival data are in good agreement with the results of the two explosion tests.
Keywords: deflagration, large eddy simulation, turbulent combustion, vented enclosure
Procedia PDF Downloads 246
1020 Transformer Fault Diagnostic Prediction Model Using Support Vector Machine with Gradient Descent Optimization
Authors: R. O. Osaseri, A. R. Usiobaifo
Abstract:
The power transformer, which is responsible for voltage transformation, is of great relevance in the power system, and oil-immersed transformers are widely used all over the world. Prompt and proper maintenance of the transformer is of utmost importance. The dissolved gas content in power transformer oil is of enormous importance in detecting incipient transformer faults, and accurate prediction of incipient faults from transformer oil is needed in order to facilitate prompt maintenance, reduce cost, and minimize error. Fault prediction and diagnosis have been the focus of many researchers, and many previous works have reported the use of artificial intelligence to predict incipient transformer failure. In this study, a machine learning technique employing gradient descent algorithms and a Support Vector Machine (SVM) is used to predict incipient transformer faults. The method focuses on creating a system that improves its performance based on previous results and historical data. The system design approach has two phases: training and testing. The gradient descent algorithm is trained with a training dataset, while the learned model is applied to a set of new data; these two datasets are used to establish the accuracy of the proposed model. A transformer fault diagnostic model based on an SVM and gradient descent algorithms is presented, with satisfactory diagnostic capability and a higher percentage accuracy in predicting incipient transformer failure than existing diagnostic methods.
Keywords: diagnostic model, gradient descent, machine learning, support vector machine (SVM), transformer fault
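A minimal sketch of an SVM fitted by stochastic gradient descent (hinge loss); the dissolved-gas feature columns and fault labels are placeholders.

```python
# SGDClassifier with hinge loss is a linear SVM trained by stochastic
# gradient descent. Dissolved-gas features below are invented placeholders.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
X = rng.lognormal(size=(400, 5))            # ppm of H2, CH4, C2H6, C2H4, C2H2
y = (X[:, 4] > X[:, 4].mean()).astype(int)  # toy incipient-fault label

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=3)
svm = make_pipeline(StandardScaler(),
                    SGDClassifier(loss="hinge", max_iter=1000, tol=1e-3))
svm.fit(Xtr, ytr)
print("diagnostic accuracy:", svm.score(Xte, yte))
```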
Procedia PDF Downloads 328
1019 Count of Trees in East Africa with Deep Learning
Authors: Nubwimana Rachel, Mugabowindekwe Maurice
Abstract:
Trees play a crucial role in maintaining biodiversity and providing various ecological services, but traditional methods of counting them are time-consuming, and there is a need for more efficient techniques. Deep learning makes it feasible to identify the multi-scale elements hidden in aerial imagery. This research focuses on the application of deep learning techniques for automated tree detection and counting in both forest and non-forest areas using satellite imagery, with the objective of identifying the most effective model for automated tree counting. We used different deep learning models, such as YOLOv7, SSD, and UNET, along with Generative Adversarial Networks to generate synthetic samples for training, and other augmentation techniques, including Random Resized Crop, AutoAugment, and Linear Contrast Enhancement. These models were trained and fine-tuned using satellite imagery to identify and count trees. The performance of the models was assessed through multiple trials; after training and fine-tuning, UNET demonstrated the best performance, with a validation loss of 0.1211, validation accuracy of 0.9509, and validation precision of 0.9799. This research showcases the success of deep learning in accurate tree counting through remote sensing, particularly with the UNET model, and represents a significant contribution to the field by offering an efficient and precise alternative to conventional tree-counting methods.
Keywords: remote sensing, deep learning, tree counting, image segmentation, object detection, visualization
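A minimal sketch of the counting step that typically follows segmentation: threshold the U-Net probability map and count connected components; the probability map here is synthetic.

```python
# Binarize a segmentation probability map and count connected components as
# individual tree crowns. The map below is a synthetic placeholder.
import numpy as np
from scipy import ndimage

prob = np.zeros((128, 128))
prob[10:20, 10:20] = 0.9                 # two fake "tree crowns"
prob[60:75, 80:95] = 0.8

mask = prob > 0.5                        # binarize the segmentation output
labeled, n_trees = ndimage.label(mask)   # pass a custom structure for 8-connectivity
print("tree count:", n_trees)
```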
Procedia PDF Downloads 80
1018 The Application and Relevance of Costing Techniques in Service-Oriented Business Organisations: A Review of the Activity-Based Costing (ABC) Technique
Authors: Udeh Nneka Evelyn
Abstract:
The shortcomings of traditional costing systems, in terms of validity, accuracy, consistency, and relevance, have increased the need for modern management accounting systems. Activity-Based Costing (ABC) can be used as a modern tool for planning, control, and decision making by management. Past studies on the ABC system have focused on manufacturing firms, leaving studies on service firms relatively scanty. This paper reviews the application and relevance of activity-based costing techniques in service-oriented business organisations, employing a qualitative research method that relied heavily on a literature review of past and current articles on ABC. Findings suggest that ABC is not only appropriate for use in a manufacturing environment; it is also most appropriate for service organizations such as financial institutions, the healthcare industry, and government organizations. In fact, some banking and financial institutions have been applying the concept for years under other names. One of them is unit costing, which is used to calculate the cost of banking services by determining the cost and consumption of each unit of output of the functions required to deliver the service. ABC, in very basic terms, may provide very good payback for businesses. Some of the benefits that relate directly to the financial services industry are: identification of the most profitable customers, more accurate product and service pricing, increased product profitability, and well-organized process costs.
Keywords: profitability, activity-based costing (ABC), management accounting, manufacture
Procedia PDF Downloads 582
1017 Advancements in Laser Welding Process: A Comprehensive Model for Predictive Geometrical, Metallurgical, and Mechanical Characteristics
Authors: Seyedeh Fatemeh Nabavi, Hamid Dalir, Anooshiravan Farshidianfar
Abstract:
Laser welding is pivotal in modern manufacturing, offering unmatched precision, speed, and efficiency. Its versatility in minimizing heat-affected zones, seamlessly joining dissimilar materials, and working with various metals makes it indispensable for crafting intricate automotive components. Integration into automated systems ensures the consistent delivery of high-quality welds, thereby enhancing overall production efficiency. Noteworthy are the safety benefits of laser welding, including reduced fumes and consumable materials, which align with industry standards and environmental sustainability goals. As the automotive sector increasingly demands advanced materials and stringent safety and quality standards, laser welding emerges as a cornerstone technology. A comprehensive model encompassing thermal-dynamic and characteristic sub-models accurately predicts the geometrical, metallurgical, and mechanical aspects of the laser beam welding process. Notably, Model 2 shows exceptional accuracy, achieving remarkably low error rates in predicting primary and secondary dendrite arm spacing (PDAS and SDAS). These findings underscore the model's reliability and effectiveness, providing valuable insights and predictive capabilities for optimizing welding processes and ensuring superior productivity, efficiency, and quality in the automotive industry.
Keywords: laser welding process, geometrical characteristics, mechanical characteristics, metallurgical characteristics, comprehensive model, thermal dynamic
Procedia PDF Downloads 53
1016 The Influence of Environmental Factors on Honey Bee Activities: A Quantitative Analysis
Authors: Hung-Jen Lin, Chien-Hao Wang, Chien-Peng Huang, Yu-Sheng Tseng, En-Cheng Yang, Joe-Air Jiang
Abstract:
Bees’ incoming and outgoing behavior is a decisive index that can indicate the health condition of a colony. Traditional methods for monitoring the behavior of honey bees (Apis mellifera) take too much time and are highly labor-intensive, and the lack of automation and synchronization prevents researchers and beekeepers from obtaining real-time information on beehives. To solve these problems, this study proposes an Internet of Things (IoT)-based system for counting honey bees' incoming and outgoing activities using an infrared interruption technique, while environmental factors are recorded simultaneously. The accuracy of the established system is verified by comparing the counting results with the outcomes of manual counting, and this highly accurate device is therefore appropriate for providing quantitative information on honey bees' incoming and outgoing behavior. Different statistical analysis methods, including one-way ANOVA and two-way ANOVA, are used to investigate the influence of environmental factors, such as temperature, humidity, illumination, and ambient pressure, on the bees' behavior. With the real-time data, a standard model is established from the analyzed relationship between environmental factors and bee behavior. In the future, smart control systems, such as a temperature control system, can also be combined with the proposed system to create an appropriate colony environment. It is expected that the proposed system will make a considerable contribution to apiculture and research.
Keywords: ANOVA, environmental factors, honey bee, incoming and outgoing behavior
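A minimal sketch of the two-way ANOVA on counted bee traffic; the data frame is a synthetic placeholder for the counts logged by the IoT system.

```python
# Two-way ANOVA: main effects of two environmental factors and their
# interaction on bee traffic counts. Data are invented placeholders.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "count": [120, 135, 90, 80, 150, 142, 70, 65],
    "temp": ["warm", "warm", "cool", "cool"] * 2,
    "humidity": ["low", "high"] * 4,
})
model = ols("count ~ C(temp) * C(humidity)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects and interaction
```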
Procedia PDF Downloads 372
1015 The Impact of Window Opening Occupant Behavior Models on Building Energy Performance
Authors: Habtamu Tkubet Ebuy
Abstract:
Purpose: Conventional dynamic energy simulation tools go beyond the static dimension of simplified methods by providing better and more accurate predictions of building performance. However, their ability to forecast actual performance is undermined by a poor representation of human interactions. The purpose of this study is to examine the potential benefits of incorporating information on occupant diversity into the occupant behavior models used to simulate building performance; co-simulation of the occupants' stochastic behavior substantially increases the accuracy of the simulation. Design/methodology/approach: Probabilistic models of occupants' window opening and closing behavior were developed in a separate multi-agent platform, SimOcc, and implemented in the building simulation tool TRNSYS, in such a way that window behavior and its interconnectivity can be reflected in the simulation analysis of the building. Findings: The results of the study show that modeling complex behaviors is important for predicting actual building performance, and they help identify the gap between reality and existing simulation methods. We hope this study and its results will serve as a guide for researchers interested in investigating occupant behavior in the future. Research limitations/implications: Further case studies involving multi-user behavior in complex commercial buildings are needed to better understand the impact of occupant behavior on building performance. Originality/value: This study supports the national strategy by demonstrating a suitable tool to help stakeholders in the design phase of new or retrofitted buildings improve the performance of office buildings.
Keywords: occupant behavior, co-simulation, energy consumption, thermal comfort
Procedia PDF Downloads 108
1014 Utilizing the Laser Cutting Method in Men's Custom-Made Casualwear
Authors: M. A. Habit, S. A. Syed-Sahil, A. Bahari
Abstract:
Laser cutting is a manufacturing process that uses a laser to cut materials. It provides extreme accuracy with a clean-cut effect; CO2 lasers dominate this application due to their good-quality beam combined with high output power. The process operates at a small scale and has limitations on the sizes of materials it can cut, which makes it more appropriate for custom-made products. The same laser cutting machine is also capable of cutting fine materials such as silk, cotton, leather, and polyester. A lack of exploration, knowledge, and awareness of this technology has caused many designers not to use the laser cutting method in their collections. The objectives of this study are: 1) to identify the potential of the laser cutting technique in custom-made garments for men's casualwear; 2) to experiment with the laser cutting technique in custom-made garments; and 3) to offer guidelines and a formula for men's custom-made casualwear designs with aesthetic value. In order to achieve these objectives, this research was conducted using mixed methods: interviews with two (2) local experts in the apparel manufacturing industries, telephone interviews with five (5) local emerging fashion designers, and questionnaires distributed to one hundred (100) respondents around the Klang Valley, in order to gauge their understanding and awareness of laser cutting technology. The experiment was conducted using natural and man-made fibers. In conclusion, all of the objectives were achieved in producing custom-made men's casualwear, and the production of these attires will help educate designers and enhance innovation in fine technology. It will also foster good linkage and collaboration between design experts and manufacturing companies.
Keywords: custom-made, fashion, laser cut, men's wear
Procedia PDF Downloads 446
1013 Geospatial Techniques and VHR Imagery Use for Identification and Classification of Slums in Gujrat City, Pakistan
Authors: Muhammad Ameer Nawaz Akram
Abstract:
The 21st century has revealed that more individuals around the world live in urban settlements than in rural zones, and the evolution of numerous cities in emerging and newly developed countries is accompanied by the rise of slums. The precise definition of a slum varies from country to country, but the universal consensus is that slums are dilapidated settlements facing severe poverty and lacking access to sanitation, water, electricity, decent living conditions, and land tenure. Slum settlements vary in unique patterns within and among countries and cities. The core objective of this study is the spatial identification and classification of slums in Gujrat city, Pakistan, from very high-resolution GeoEye-1 (0.41 m) satellite imagery. Slums were first identified using GPS for sample site identification and ground-truthing; through this process, 425 slums were identified. Object-Oriented Analysis (OOA) was then applied to classify slums in the digital imagery. Spatial analysis software packages, e.g., ArcGIS 10.3, Erdas Imagine 9.3, and ENVI 5.1, were used for processing the data and performing the analysis. Results show that OOA provides up to 90% accuracy for the identification of slums. The Jalal Cheema and Allah Ho colonies are severely affected by slum settlements, and the rate of criminal activity is also higher there than in other areas. Slums are increasing in urban areas with the passage of time and will become a hazardous problem in the coming future, so executive bodies need to make effective policies and move toward the amelioration of the city.
Keywords: slums, GPS, satellite imagery, object oriented analysis, zonal change detection
Procedia PDF Downloads 137
1012 Applying Kinect to the Development of a Customized 3D Mannequin
Authors: Shih-Wen Hsiao, Rong-Qi Chen
Abstract:
In the field of fashion design, the 3D mannequin is an assisting tool that can rapidly realize design concepts. When the concept of the 3D mannequin is applied to computer-aided fashion design, it connects with the development and application of design platforms and systems. It is therefore critical to develop a 3D mannequin module that corresponds to the needs of fashion design. This research proposes a concrete plan for developing and constructing a 3D mannequin system with Kinect. Ergonomic measurements of the subject's features are attained in real time through the Kinect depth camera, and mesh morphing is then implemented by transforming the locations of control points on the model according to these ergonomic data, producing an exclusive 3D mannequin model. In the proposed methodology, after the scanned points from the Kinect are corrected for accuracy and smoothed, the complete human shape is reconstructed by the ICP (iterative closest point) algorithm together with image-processing methods; the subject's features can then be recognized, analyzed, and measured. Furthermore, the ergonomic measurement data can be applied to shape morphing of the 3D mannequin, reconstructed by feature curves through subdivision. Since a standardized and customer-oriented 3D mannequin is generated by subdivision, the research can be applied to fashion design or to the presentation and display of 3D virtual clothes. To examine the practicality of the research structure, a 3D mannequin system was constructed with a Java program, and its practicability was confirmed through iterative experiments.
Keywords: 3D mannequin, Kinect scanner, iterative closest point, shape morphing, subdivision
Procedia PDF Downloads 312
1011 Computer-Aided Diagnosis Bringing Changes in Breast Cancer Detection
Authors: Devadrita Dey Sarkar
Abstract:
Regardless of the many technological advances of the past decade, increased training and experience, and the obvious benefits of uniform standards, the false-negative rate in screening mammography remains unacceptably high. A computer-aided neural network classification of regions of suspicion (ROS) on digitized mammograms is presented in this abstract, employing features extracted by a new technique based on independent component analysis. CAD is a concept established by taking into account equally the roles of physicians and computers, whereas automated computer diagnosis is a concept based on computer algorithms only. With CAD, the performance of computers does not have to be comparable to or better than that of physicians, but needs to be complementary to it. In fact, a large number of CAD systems have been employed to assist physicians in the early detection of breast cancers on mammograms. A CAD scheme that makes use of lateral breast images has the potential to improve the overall performance in the detection of breast lumps; because breast lumps can be detected reliably by computer on lateral breast mammograms, radiologists' accuracy in detecting breast lumps would be improved by the use of CAD, making early diagnosis of breast cancer possible. In the future, many CAD schemes could be assembled as packages and implemented as part of PACS. For example, a package for breast CAD may include the computerized detection of breast nodules as well as the computerized classification of benign and malignant nodules. To assist in differential diagnosis, it would be possible to search for and retrieve images (or lesions) with these CAD systems, which would be a reliable and useful method for quantifying the similarity of a pair of images for visual comparison by radiologists.
Keywords: CAD (computer-aided diagnosis), lesions, neural network, ROS (regions of suspicion)
Procedia PDF Downloads 457
1010 On the Accuracy of Basic Modal Displacement Method Considering Various Earthquakes
Authors: Seyed Sadegh Naseralavi, Sadegh Balaghi, Ehsan Khojastehfar
Abstract:
Time history seismic analysis is considered the most accurate method for predicting the seismic demand of structures; on the other hand, its required computational time is its main deficiency. When applied in an optimization process, in which the structure must be analyzed thousands of times, reducing the computational time of seismic analysis makes the optimization algorithms more practical. The approximate methods devised for this purpose produce some error in comparison with exact time history analysis, but methods such as the Complete Quadratic Combination (CQC) and the Square Root of the Sum of Squares (SRSS) drastically reduce the computational time by combining the peak responses of each mode. In the present research, the Basic Modal Displacement (BMD) method is introduced and applied to the estimation of the seismic demand of a main structure. The seismic demand of the sampled structure is estimated from the modal displacements of a basic structure for which the modal displacements have been calculated. Steel shear structures are selected as case studies. The error of the introduced method is calculated by comparing the estimated seismic demands with exact time history dynamic analysis. The efficiency of the proposed method is demonstrated by the application of three types of earthquakes (classified in view of the time of peak ground acceleration).
Keywords: time history dynamic analysis, basic modal displacement, earthquake-induced demands, shear steel structures
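A minimal sketch of the SRSS combination mentioned above; the per-mode peak values are illustrative.

```python
# SRSS modal combination: combine per-mode peak responses as the square root
# of the sum of their squares. Peak values below are invented.
import numpy as np

peak_modal_disp = np.array([0.042, 0.013, 0.005])   # peak response per mode [m]
srss = np.sqrt(np.sum(peak_modal_disp**2))
print(f"SRSS combined peak displacement: {srss:.4f} m")
```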
Procedia PDF Downloads 360
1009 Synthetic Data-Driven Prediction Using GANs and LSTMs for Smart Traffic Management
Authors: Srinivas Peri, Siva Abhishek Sirivella, Tejaswini Kallakuri, Uzair Ahmad
Abstract:
Smart cities and intelligent transportation systems rely heavily on effective traffic management and infrastructure planning. This research tackles the data-scarcity challenge by generating realistic synthetic traffic data from the PeMS-Bay dataset, enhancing the accuracy and reliability of predictive modeling. Advanced techniques such as TimeGAN and GaussianCopula are utilized to create synthetic data that mimics the statistical and structural characteristics of real-world traffic. The future integration of Spatial-Temporal Generative Adversarial Networks (ST-GAN) is anticipated to capture both spatial and temporal correlations, further improving data quality and realism. The performance of each synthetic data generation model is evaluated against real-world data to identify the most effective models for accurately replicating traffic patterns. Long Short-Term Memory (LSTM) networks are employed to model and predict complex temporal dependencies within traffic patterns. This holistic approach aims to identify areas with low vehicle counts, reveal underlying traffic issues, and guide targeted infrastructure interventions. By combining GAN-based synthetic data generation with LSTM-based traffic modeling, this study facilitates data-driven decision-making that improves urban mobility, safety, and the overall efficiency of city planning initiatives.
Keywords: GAN, long short-term memory (LSTM), synthetic data generation, traffic management
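A minimal sketch of the LSTM forecasting stage on a synthetic series; real experiments would use the (TimeGAN-augmented) PeMS-Bay data.

```python
# Windows of past sensor readings predict the next reading with an LSTM.
# The series below is synthetic, standing in for a traffic sensor signal.
import numpy as np
from tensorflow import keras

series = np.sin(np.linspace(0, 60, 600)) + 0.1 * np.random.randn(600)
win = 24
X = np.stack([series[i:i + win] for i in range(len(series) - win)])[..., None]
y = series[win:]

model = keras.Sequential([
    keras.layers.LSTM(32, input_shape=(win, 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, verbose=0)
print("next-step forecast:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```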
Procedia PDF Downloads 17