Search results for: accuracy improvement
7124 Analyzing Current Transformer’s Transient and Steady State Behavior for Different Burden’s Using LabVIEW Data Acquisition Tool
Abstract:
Current transformers (CTs) are used to transform large primary currents into a small secondary current. Since most standard equipment is not designed to handle large primary currents, CTs play an important part in any electrical system for the purposes of metering and protection, both of which are integral to the power system. Nowadays, due to advancements in solid-state technology, the operating times of protective relays have come down from a few seconds to a few cycles. In such a scenario, it becomes important to study the transient response of current transformers, as it plays a vital role in the operation of protective devices. This paper presents the steady-state and transient behavior of current transformers and how it changes with changes in the connected burden. The transient and steady-state responses will be captured using the data acquisition software LabVIEW, and analysis is done on the real-time data gathered with it. The variation of current transformer characteristics with changes in burden will be discussed.
Keywords: accuracy, accuracy limiting factor, burden, current transformer, instrument security factor
Procedia PDF Downloads 344
7123 Algorithm Research on Traffic Sign Detection Based on Improved EfficientDet
Authors: Ma Lei-Lei, Zhou You
Abstract:
To address the low detection accuracy of deep learning algorithms in traffic sign detection, this paper proposes an improved EfficientDet-based traffic sign detection algorithm. Multi-head self-attention is introduced in the minimum-resolution layer of the EfficientDet backbone to achieve effective aggregation of local and global depth information, and an improved feature fusion pyramid with additional vertical cross-layer connections is proposed, which improves the performance of the model while introducing only a small amount of complexity. The Balanced L1 Loss is introduced to replace the original regression loss function, Smooth L1 Loss, which addresses the balance problem in the loss function. Experimental results show that the algorithm proposed in this study is suitable for the task of traffic sign detection. Compared with other models, the improved EfficientDet has the best detection accuracy. Although its test speed is not completely dominant, it still meets the real-time requirement.
Keywords: convolutional neural network, transformer, feature pyramid networks, loss function
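The Balanced L1 Loss referred to above has a standard published form (from Libra R-CNN); the sketch below is a NumPy illustration of that form under default parameters (alpha=0.5, gamma=1.5), which are assumptions rather than the settings used by the authors.

```python
import numpy as np

def balanced_l1_loss(pred, target, alpha=0.5, gamma=1.5):
    """Element-wise Balanced L1 loss (Libra R-CNN form); parameter values assumed."""
    x = np.abs(pred - target)
    # b is chosen so the gradients of the two branches match at |x| = 1
    b = np.e ** (gamma / alpha) - 1.0
    small = alpha / b * (b * x + 1.0) * np.log(b * x + 1.0) - alpha * x
    # constant keeps the loss continuous at |x| = 1
    large = gamma * x + (alpha / b * (b + 1.0) * np.log(b + 1.0) - alpha - gamma)
    return np.where(x < 1.0, small, large)

# Example: regression offsets vs. ground-truth offsets
print(balanced_l1_loss(np.array([0.2, 1.8]), np.array([0.0, 0.0])).mean())
```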
Procedia PDF Downloads 99
7122 Accurate Mass Segmentation Using U-Net Deep Learning Architecture for Improved Cancer Detection
Authors: Ali Hamza
Abstract:
Accurate segmentation of breast ultrasound images is of paramount importance in enhancing the diagnostic capabilities of breast cancer detection. This study presents an approach utilizing the U-Net architecture for segmenting breast ultrasound images aimed at improving the accuracy and reliability of mass identification within the breast tissue. The proposed method encompasses a multi-stage process. Initially, preprocessing techniques are employed to refine image quality and diminish noise interference. Subsequently, the U-Net architecture, a deep learning convolutional neural network (CNN), is employed for pixel-wise segmentation of regions of interest corresponding to potential breast masses. The U-Net's distinctive architecture, characterized by a contracting and expansive pathway, enables accurate boundary delineation and detailed feature extraction. To evaluate the effectiveness of the proposed approach, an extensive dataset of breast ultrasound images is employed, encompassing diverse cases. Quantitative performance metrics such as the Dice coefficient, Jaccard index, sensitivity, specificity, and Hausdorff distance are employed to comprehensively assess the segmentation accuracy. Comparative analyses against traditional segmentation methods showcase the superiority of the U-Net architecture in capturing intricate details and accurately segmenting breast masses. The outcomes of this study emphasize the potential of the U-Net-based segmentation approach in bolstering breast ultrasound image analysis. The method's ability to reliably pinpoint mass boundaries holds promise for aiding radiologists in precise diagnosis and treatment planning. However, further validation and integration within clinical workflows are necessary to ascertain its practical clinical utility and facilitate seamless adoption by healthcare professionals. In conclusion, leveraging the U-Net architecture for breast ultrasound image segmentation showcases a robust framework that can significantly enhance diagnostic accuracy and advance the field of breast cancer detection. This approach represents a pivotal step towards empowering medical professionals with a more potent tool for early and accurate breast cancer diagnosis.
Keywords: image segmentation, U-Net, deep learning, breast cancer detection, diagnostic accuracy, mass identification, convolutional neural network
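Two of the overlap metrics named above, the Dice coefficient and the Jaccard index, can be computed directly from a binary prediction mask and a ground-truth mask; the minimal NumPy sketch below illustrates their standard definitions and is not the study's evaluation code.

```python
import numpy as np

def dice_coefficient(pred, truth, eps=1e-7):
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    return (2.0 * inter + eps) / (pred.sum() + truth.sum() + eps)

def jaccard_index(pred, truth, eps=1e-7):
    """Jaccard (IoU) = |A ∩ B| / |A ∪ B| for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return (inter + eps) / (union + eps)

pred = np.array([[0, 1, 1], [0, 1, 0]])
truth = np.array([[0, 1, 0], [0, 1, 1]])
print(dice_coefficient(pred, truth), jaccard_index(pred, truth))
```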
Procedia PDF Downloads 84
7121 Building Scalable and Accurate Hybrid Kernel Mapping Recommender
Authors: Hina Iqbal, Mustansar Ali Ghazanfar, Sandor Szedmak
Abstract:
Recommender systems use artificial intelligence practices to filter obscure information and can predict whether a user will like a specified item. Kernel Mapping Recommender (KMR) systems have been proposed as accurate, state-of-the-art algorithms that resolve recommender system design objectives such as the long tail, cold start, and sparsity. The aim of this research is to propose a hybrid framework that can efficiently integrate different versions of the KMR algorithm, namely item-based and user-based KMR. We have proposed various heuristic algorithms that integrate the different versions of KMR into a unified framework, resulting in improved accuracy and the elimination of problems associated with conventional recommender systems. We have tested our system on a publicly available movies dataset and benchmarked it against KMR. The results (in terms of accuracy, precision, recall, F1 measure, and ROC metrics) reveal that the proposed algorithm is quite accurate, especially under cold-start and sparse scenarios.
Keywords: Kernel Mapping Recommender Systems, hybrid recommender systems, cold start, sparsity, long tail
Procedia PDF Downloads 341
7120 Improvement of Production of γ-Aminobutyric Acid by Lactobacillus plantarum Isolated from Indigenous Fermented Durian (Tempoyak)
Authors: Yetti Marlida, Harnentis, Yuliaty Shafan Nur
Abstract:
Background: Tempoyak is a dish derived from fermented durian fruit and is consumed as a side dish with rice. Besides being eaten with rice, tempoyak can also be eaten directly, but this is rarely done because many people cannot stand its sour taste and aroma. In addition, tempoyak can also be used as a seasoning. Its sour taste arises from the fermentation process in the durian fruit flesh that serves as the raw material. Tempoyak is very well known in Indonesia, especially in Padang, Bengkulu, Palembang, Lampung, and Kalimantan, and it is also famous in Malaysia. The purpose of this research is to improve the production of γ-aminobutyric acid (GABA) by Lactobacillus plantarum isolated from indigenous fermented durian (tempoyak). Lactic Acid Bacteria (LAB) previously isolated from indigenous fermented durian (tempoyak) and able to produce GABA were selected. The study started with the identification of the selected LAB by 16S rRNA, followed by optimization of GABA production through culture conditions using different initial pH values, temperatures, glutamate concentrations, incubation times, and carbon and nitrogen sources. Results: Identification using polymerase chain reaction of 16S rRNA gene sequences and phylogenetic analysis showed the isolate to be Lactobacillus plantarum (coded Y3), with a sequence length of 1400 bp. GABA production was highest at pH 6.0, a temperature of 30 °C, a glutamate concentration of 0.4%, an incubation time of 60 h, and glucose and yeast extract as the carbon and nitrogen sources. Conclusions: Under the optimum fermentation conditions, 66.06 mM of GABA was produced.
Keywords: lactic acid bacteria, γ-amino butyric acid, indigenous fermented durian, PCR
Procedia PDF Downloads 143
7119 Effect of Omega-3 Supplementation on Stunted Egyptian Children at Risk of Environmental Enteric Dysfunction: An Interventional Study
Authors: Ghada M. El-Kassas, Maged A. El Wakeel, Salwa R. El-Zayat
Abstract:
Background: Environmental enteric dysfunction (EED) is asymptomatic villous atrophy of the small bowel that is prevalent in the developing world and is associated with altered intestinal function and integrity. Evidence has suggested that supplementary omega-3 might ameliorate this damage by reducing gastrointestinal inflammation and may also benefit cognitive development. Objective: We tested whether omega-3 supplementation improves intestinal integrity, growth, and cognitive function in stunted children predicted to have EED. Methodology: The study included 100 Egyptian stunted children aged 1-5 years and 100 age- and gender-matched normal children as controls. In the primary phase of the study, we assessed anthropometric measures and fecal markers such as myeloperoxidase (MPO), neopterin (NEO), and alpha-1-anti-trypsin (AAT) (as predictors of EED). Cognitive development was assessed (Bayley or Wechsler scores). Oral n-3 (omega-3) LC-PUFA at a dosage of 500 mg/d was supplemented to all cases, who were followed up for 6 months, after which the secondary phase of the study repeated the previous clinical, laboratory, and cognitive assessments. Results: Fecal inflammatory markers were significantly higher in cases compared to controls. MPO, NEO, and AAT showed a significant decline in cases at the end of the secondary phase (P < 0.001 for all). Omega-3 supplementation also resulted in a significant increase in mid-upper arm circumference (MUAC) (P < 0.01), weight-for-age z-score, and skinfold thicknesses (P < 0.05 for both). Cases showed significant improvement of cognitive function at phase 2 of the study. Conclusions: Omega-3 supplementation successfully improved the intestinal inflammatory state related to EED. Anthropometric and cognitive parameters also showed obvious improvement with omega-3 supplementation.
Keywords: cognitive functions, EED, omega-3, stunting
Procedia PDF Downloads 150
7118 Capability of Available Seismic Soil Liquefaction Potential Assessment Models Based on Shear-Wave Velocity Using Banchu Case History
Authors: Nima Pirhadi, Yong Bo Shao, Xusheng Wa, Jianguo Lu
Abstract:
Several models based on the simplified method introduced by Seed and Idriss (1971) have been developed to assess the liquefaction potential of saturated sandy soils. The procedure involves determining the cyclic resistance of the soil as the cyclic resistance ratio (CRR) and comparing it with the earthquake load expressed as the cyclic stress ratio (CSR). Of all the methods to determine CRR, those using shear-wave velocity (Vs) are common because of their low sensitivity to the penetration resistance reduction caused by fine content (FC). To evaluate the capability of the Vs-based models, new data from the Bachu-Jianshi earthquake case history were collected, and the predictions of the models were compared to the measured results; the accuracy of the models is then discussed via three criteria and graphs. The evaluation demonstrates reasonable accuracy of the models in the Banchu region.
Keywords: seismic liquefaction, banchu-jiashi earthquake, shear-wave velocity, liquefaction potential evaluation
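The CSR side of the simplified procedure of Seed and Idriss has a standard expression; the sketch below illustrates it in Python, with the stress-reduction coefficient r_d approximated by a commonly used depth relation. That r_d form is an assumption, since the abstract does not state which version the authors adopted.

```python
def stress_reduction_rd(depth_m):
    """Commonly used depth-dependent r_d approximation (assumed form)."""
    if depth_m <= 9.15:
        return 1.0 - 0.00765 * depth_m
    return 1.174 - 0.0267 * depth_m  # valid to roughly 23 m depth

def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Simplified-procedure CSR = 0.65 * (a_max/g) * (sigma_v / sigma'_v) * r_d."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * stress_reduction_rd(depth_m)

# Example: PGA of 0.3 g, total/effective vertical stresses in kPa at 6 m depth
print(cyclic_stress_ratio(0.3, 110.0, 70.0, 6.0))
# Liquefaction is predicted where CSR exceeds the Vs-based CRR at that depth.
```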
Procedia PDF Downloads 241
7117 Using Photogrammetric Techniques to Map the Mars Surface
Authors: Ahmed Elaksher, Islam Omar
Abstract:
For many years, the Mars surface has been a mystery for scientists. Lately, with the help of geospatial data and photogrammetric procedures, researchers have been able to capture some insights about this planet. Two of the most important data sources for exploring Mars are the High Resolution Imaging Science Experiment (HiRISE) and the Mars Orbiter Laser Altimeter (MOLA). HiRISE is one of six science instruments carried by the Mars Reconnaissance Orbiter, launched August 12, 2005, and managed by NASA. The MOLA sensor is a laser altimeter carried by the Mars Global Surveyor (MGS), launched on November 7, 1996. In this project, we used MOLA-based DEMs to orthorectify HiRISE optical images in order to generate a more accurate and trustworthy surface of Mars. The MOLA data were interpolated using the kriging interpolation technique. Corresponding tie points were digitized from both datasets and employed in co-registering the datasets using GIS analysis tools. We employed three different 3D-to-2D transformation models: the parallel projection (3D affine) transformation model, the extended parallel projection transformation model, and the Direct Linear Transformation (DLT) model. The digitized tie points were split into two sets: Ground Control Points (GCPs), used to estimate the transformation parameters using least squares adjustment techniques, and check points (ChkPs), used to evaluate the computed transformation parameters. Results were evaluated using the RMSEs between the precise horizontal coordinates of the digitized check points and those estimated through the transformation models using the computed transformation parameters. For each set of GCPs, three different configurations of GCPs and check points were tested, and average RMSEs are reported. It was found that for the 2D transformation models, average RMSEs were in the range of five meters. Increasing the number of GCPs from six to ten points improved the accuracy of the results by about two and a half meters; further increasing the number of GCPs did not improve the results significantly. Using the 3D-to-2D transformation parameters provided two to three meters accuracy. The best results were obtained using the DLT transformation model; however, increasing the number of GCPs did not have a substantial effect. The results support the use of the DLT model, as it provides the required accuracy for ASPRS large-scale mapping standards, although well-distributed sets of GCPs are key to achieving such accuracy. The model is simple to apply and does not need substantial computations.
Keywords: mars, photogrammetry, MOLA, HiRISE
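The DLT model named above relates 3D ground coordinates to 2D image coordinates through eleven parameters estimated from GCPs; the sketch below shows the standard least-squares fit and the check-point RMSE used for comparison. It is a generic illustration of the technique, not the authors' code.

```python
import numpy as np

def fit_dlt(xyz, uv):
    """Least-squares estimate of the 11 DLT parameters from >= 6 GCPs."""
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(xyz, uv):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z]); b.append(u)
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z]); b.append(v)
    L, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return L

def apply_dlt(L, xyz):
    """Project 3D points into image coordinates with fitted DLT parameters."""
    X, Y, Z = np.asarray(xyz, float).T
    den = L[8] * X + L[9] * Y + L[10] * Z + 1.0
    u = (L[0] * X + L[1] * Y + L[2] * Z + L[3]) / den
    v = (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / den
    return np.column_stack([u, v])

def rmse(pred_uv, true_uv):
    """Planimetric RMSE over check points, the comparison metric used here."""
    return float(np.sqrt(np.mean(np.sum((pred_uv - true_uv) ** 2, axis=1))))
```

With six or more GCPs the system of equations is over-determined and solved by least squares; check points left out of the fit provide the reported RMSEs.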
Procedia PDF Downloads 59
7116 COVID-19 Analysis with Deep Learning Model Using Chest X-Rays Images
Authors: Uma Maheshwari V., Rajanikanth Aluvalu, Kumar Gautam
Abstract:
The COVID-19 disease is a highly contagious viral infection with major worldwide health implications, and the global economy suffers as a result of COVID. The spread of this pandemic disease can be slowed if positive patients are found early. COVID-19 disease prediction is beneficial for identifying patients at risk of COVID-related health problems. Deep learning and machine learning algorithms for COVID prediction using X-rays have the potential to be extremely useful in addressing the scarcity of doctors and clinicians in remote places. In this paper, a convolutional neural network (CNN) with deep layers is presented for recognizing COVID-19 patients using real-world datasets. We gathered around 6000 X-ray scan images from various sources and split them into two categories: normal and COVID-impacted. Our model examines chest X-ray images to recognize such patients. Because X-rays are commonly available and affordable, our findings show that X-ray analysis is effective in COVID diagnosis. The predictions performed well, with an average accuracy of 99% on training images and 88% on X-ray test images.
Keywords: deep CNN, COVID-19 analysis, feature extraction, feature map, accuracy
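The abstract does not give the network architecture, so the layer sizes below are placeholders; the sketch only illustrates a deep CNN for binary (normal vs. COVID-impacted) chest X-ray classification in Keras, under assumed input dimensions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_covid_cnn(input_shape=(224, 224, 1)):
    """Small CNN for binary chest X-ray classification; layer sizes are assumptions."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),  # COVID-impacted vs. normal
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# model = build_covid_cnn()
# model.fit(train_images, train_labels, validation_split=0.1, epochs=20)
```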
Procedia PDF Downloads 81
7115 Classification of Political Affiliations by Reduced Number of Features
Authors: Vesile Evrim, Aliyu Awwal
Abstract:
With the evolution of technology, the expression of opinions has shifted toward the digital world. The domain of politics, one of the hottest topics in opinion mining research, is merged here with behavior analysis for affiliation determination in text, which constitutes the subject of this paper. This study aims to classify text in news/blogs as either Republican or Democrat with the minimum number of features. As an initial set, 68 features, 64 of which are Linguistic Inquiry and Word Count (LIWC) features, are tested against 14 benchmark classification algorithms. In later experiments, the dimensionality of the feature vector is reduced using 7 feature selection algorithms. The results show that the Decision Tree, Rule Induction, and M5 Rule classifiers, when used with the SVM and IGR feature selection algorithms, performed best, with up to 82.5% accuracy on the given dataset. Further tests on a single feature and on the linguistic-based feature sets showed similar results. The feature "function", as an aggregate feature of the linguistic category, is obtained as the most differentiating feature among the 68 features, achieving 81% accuracy by itself in classifying articles as either Republican or Democrat.
Keywords: feature selection, LIWC, machine learning, politics
Procedia PDF Downloads 383
7114 Specific Emitter Identification Based on Refined Composite Multiscale Dispersion Entropy
Authors: Shaoying Guo, Yanyun Xu, Meng Zhang, Weiqing Huang
Abstract:
Wireless communication networks are developing rapidly, and thus wireless security is becoming more and more important. Specific emitter identification (SEI) is a vital part of wireless communication security as a technique to identify unique transmitters. In this paper, an SEI method based on multiscale dispersion entropy (MDE) and refined composite multiscale dispersion entropy (RCMDE) is proposed. The MDE and RCMDE algorithms are used to extract features for the identification of five wireless devices, and a cross-validation support vector machine (CV-SVM) is used as the classifier. The experimental results show that the total identification accuracy is 99.3%, even at a low signal-to-noise ratio (SNR) of 5 dB, which proves that MDE and RCMDE can describe the communication signal series well. In addition, compared with other methods, the proposed method is effective and provides better accuracy and stability for SEI.
Keywords: cross-validation support vector machine, refined composite multiscale dispersion entropy, specific emitter identification, transient signal, wireless communication device
Procedia PDF Downloads 130
7113 REFLEX: A Randomized Controlled Trial to Test the Efficacy of an Emotion Regulation Flexibility Program with Daily Measures
Authors: Carla Nardelli, Jérome Holtzmann, Céline Baeyens, Catherine Bortolon
Abstract:
Background. Emotion regulation (ER) is a process associated with difficulties in mental health. Given its transdiagnostic features, its improvement could facilitate recovery from various psychological issues. A limitation of current studies is the lack of knowledge regarding whether available interventions improve ER flexibility (i.e., the ability to implement ER strategies in line with contextual demands), even though this capacity has been associated with better mental health and well-being. Therefore, the aim of the study is to test the efficacy of a 9-week ER group program (the Affect Regulation Training, ART), using the most appropriate measures (i.e., the experience sampling method) in a student population. In addition, the goal of the study is to explore the potential mediating role of ER flexibility in mental health improvement. Method. This randomized controlled trial will compare the ER program group to an active control group (a relaxation program) in 100 participants. To test the mediating role of ER flexibility in mental health, daily measures will be used before, during, and after the interventions to evaluate the extent to which participants are flexible in their ER. Expected outcomes. Using multilevel analyses, we expect an improvement in anxious-depressive symptomatology for both groups. However, we expect the ART group to improve specifically in ER flexibility and the latter to act as a mediating variable for mental health. Conclusion. This study will enhance knowledge on interventions for students and on the impact of interventions on ER flexibility. This research will also improve knowledge on ecological measures for assessing the effects of interventions. Overall, this project represents a new opportunity to improve ER skills in order to improve mental health in undergraduate students.
Keywords: emotion regulation flexibility, experience sampling method, psychological intervention, emotion regulation skills
Procedia PDF Downloads 137
7112 Analysis of Bored Piles with and without Geogrid in a Selected Area in Kocaeli/Turkey
Authors: Utkan Mutman, Cihan Dirlik
Abstract:
In a selected field in the Kocaeli district of Turkey where wastewater is held, piling was carried out in order to improve the ground under the aeration basin. In this study, the degree of ground improvement achieved by the bored piling performed in the field was investigated. In this context, the ground before and after improvement was examined, and solutions were obtained by finite element analysis using the Plaxis program. The diffusers in the aeration basin, which aid the treatment process, are influenced by the ground response with and without geogrid. On the improved ground, pile continuity and pile load tests were performed in order to check the manufactured bored piles. Taking into consideration both the data from the field and the dynamic loads in the aeration basin, an analysis was carried out in the Plaxis program, and the data obtained from the analysis results were compared with the data obtained in the field.
Keywords: geogrid, bored pile, soil improvement, plaxis
Procedia PDF Downloads 268
7111 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features
Authors: Bushra Zafar, Usman Qamar
Abstract:
Large data sample sizes and dimensions limit the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for collecting useful information from a variety of databases; they provide supervised learning in the form of classification to design models that describe vital data classes, where the structure of the classifier is based on the class attribute. Classification efficiency and accuracy are often influenced to a great extent by noisy and undesirable features in real application data sets. The inherent nature of a data set greatly masks its quality analysis and leaves us with quite few practical approaches to use. To our knowledge, we present for the first time a new approach for investigating the structure and quality of datasets by providing a targeted analysis of the localization of noisy and irrelevant features. Feature selection is a key pre-processing step in machine learning that allows a few features to be selected from a larger number of features as a subset, reducing the space according to a certain evaluation criterion. The primary objective of this study is to trim down the scope of the given data sample by searching for a small set of important features that may result in good classification performance. For this purpose, a heuristic for wrapper-based feature selection using a genetic algorithm is used, with an external classifier for discriminative feature selection. Features are selected based on their number of occurrences in the chosen chromosomes. Sample datasets have been used to demonstrate the proposed idea effectively. The proposed method improved the average accuracy on different datasets to about 95%. Experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
Keywords: data mining, genetic algorithm, KNN algorithms, wrapper-based feature selection
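A minimal sketch of wrapper-based feature selection with a genetic algorithm: each chromosome is a binary feature mask scored by cross-validated accuracy of a KNN classifier acting as the wrapper. The population size, mutation rate, and KNN choice are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    """Wrapper fitness: cross-validated accuracy of KNN on the selected features."""
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

def ga_feature_selection(X, y, pop_size=20, generations=30, p_mut=0.05):
    n = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n))
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)                     # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n) < p_mut                 # bit-flip mutation
            child[flip] ^= 1
            children.append(child)
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[scores.argmax()].astype(bool)             # best feature mask
```

Counting how often each feature appears in the best chromosomes across runs gives the occurrence-based ranking described in the abstract.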
Procedia PDF Downloads 318
7110 Investigating Data Normalization Techniques in Swarm Intelligence Forecasting for Energy Commodity Spot Price
Authors: Yuhanis Yusof, Zuriani Mustaffa, Siti Sakira Kamaruddin
Abstract:
Data mining is a fundamental technique for identifying patterns in large data sets. The extracted facts and patterns contribute to various domains such as marketing, forecasting, and medicine. Prior to mining, data are consolidated so that the resulting mining process may be more efficient. This study investigates the effect of different data normalization techniques, namely Min-max, Z-score, and decimal scaling, on swarm-based forecasting models. The swarm intelligence algorithms employed are the Grey Wolf Optimizer (GWO) and the Artificial Bee Colony (ABC). Forecasting models are then developed to predict the daily spot price of crude oil and gasoline. Results showed that GWO works better with the Z-score normalization technique, while ABC produces better accuracy with Min-max. Nevertheless, GWO is superior to ABC, as its model generates the highest accuracy for both crude oil and gasoline prices. Such a result indicates that GWO is a promising competitor in the family of swarm intelligence algorithms.
Keywords: artificial bee colony, data normalization, forecasting, Grey Wolf optimizer
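The three normalization techniques compared here have standard definitions; the sketch below shows them for a one-dimensional price series, with the example values purely illustrative.

```python
import numpy as np

def min_max(x, lo=0.0, hi=1.0):
    """Rescale values to the range [lo, hi]."""
    x = np.asarray(x, float)
    return lo + (x - x.min()) * (hi - lo) / (x.max() - x.min())

def z_score(x):
    """Shift and scale to zero mean and unit standard deviation."""
    x = np.asarray(x, float)
    return (x - x.mean()) / x.std()

def decimal_scaling(x):
    """Divide by 10^j, with j chosen so the scaled magnitudes fall below 1."""
    x = np.asarray(x, float)
    j = int(np.ceil(np.log10(np.abs(x).max())))
    return x / (10 ** j)

spot_prices = np.array([71.3, 74.8, 69.5, 80.2, 77.1])  # illustrative values
print(min_max(spot_prices), z_score(spot_prices), decimal_scaling(spot_prices), sep="\n")
```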
Procedia PDF Downloads 478
7109 Improvement of Overall Equipment Effectiveness of Load Haul Dump Machines in Underground Coal Mines
Authors: J. BalaRaju, M. Govinda Raj, C. S. N. Murthy
Abstract:
Every organization in the competitive world tends to improve its economy by increasing its production and productivity rates. Production in Indian underground mines over the years has, however, not been satisfactory, due to a variety of reasons. There are manifold avenues for the betterment of production, and one such approach is enhanced utilization of mechanized equipment such as the Load Haul Dumper (LHD), which is used for loading and hauling in underground mines. In view of the aforementioned facts, this paper delves into the identification of key influencing factors, such as LHD maintenance effectiveness, vehicle condition, operator skill, and utilization of the machines, on the performance of LHDs. An attempt has been made to improve the performance of the equipment through evaluation of Overall Equipment Effectiveness (OEE). Two different approaches for the evaluation of OEE have been adopted and compared under various operating conditions. The OEE calculation, in terms of percentage availability, performance, and quality, is applied, and the hitherto existing situation of underground mine production is evaluated. Necessary recommendations are suggested to the mining industry on the basis of OEE.
Keywords: utilization, maintenance, availability, performance and quality
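OEE in the percentage form used above is the product of availability, performance, and quality; the sketch below computes it from shift-level LHD figures. The example numbers are purely illustrative, not data from the study.

```python
def oee(planned_time, downtime, ideal_cycle_time, total_trips, good_trips):
    """OEE = Availability x Performance x Quality, each expressed as a fraction."""
    operating_time = planned_time - downtime
    availability = operating_time / planned_time
    performance = (ideal_cycle_time * total_trips) / operating_time
    quality = good_trips / total_trips
    return availability * performance * quality, (availability, performance, quality)

# Illustrative LHD shift: 480 min planned, 90 min down, 3 min ideal cycle time,
# 110 trips hauled, of which 100 met the target payload.
value, parts = oee(480, 90, 3, 110, 100)
print(f"OEE = {value:.1%}; A/P/Q = " + ", ".join(f"{p:.1%}" for p in parts))
```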
Procedia PDF Downloads 222
7108 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data
Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar
Abstract:
It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. However, better predictions remain the foundation of all science, and therefore the development of accurate, robust, and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature. There are still two dominant major forecasting approaches, Box-Jenkins ARIMA and Exponential Smoothing (ES), and new methods are still derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing remains one of the most practically relevant forecasting methods available due to its simplicity, robustness, and accuracy as an automatic forecasting procedure, especially in the famous M-Competitions. Despite their success and widespread use in many areas, ES models have some shortcomings that negatively affect the accuracy of forecasts. Therefore, a new forecasting method, called the ATA method, is proposed in this study to cope with these shortcomings. This new method is obtained from traditional ES models by modifying the smoothing parameters; therefore, both methods have similar structural forms, and ATA can be easily adapted to all of the individual ES models. However, ATA has many advantages due to its innovative new weighting scheme. In this paper, the focus is on modeling the trend component and handling seasonality patterns by utilizing classical decomposition. Therefore, the ATA method is extended to higher-order ES methods for additive, multiplicative, additive damped, and multiplicative damped trend components. The proposed models are called ATA trended models, and their predictive performances are compared to their counterpart ES models on the M3 competition data set, since it is still the most recent and comprehensive time-series data collection available. It is shown that the models outperform their counterparts in almost all settings, and when a model selection is carried out among these trended models, ATA outperforms all of the competitors in the M3-competition for both short-term and long-term forecasting horizons when the models' forecasting accuracies are compared based on popular error metrics.
Keywords: accuracy, exponential smoothing, forecasting, initial value
Procedia PDF Downloads 177
7107 A Supervised Approach for Word Sense Disambiguation Based on Arabic Diacritics
Authors: Alaa Alrakaf, Sk. Md. Mizanur Rahman
Abstract:
Over the last two decades, Arabic natural language processing (ANLP) has become increasingly important. One of the key issues in ANLP is ambiguity: in Arabic, different pronunciations of one word may carry different meanings. Furthermore, ambiguity also has an impact on the effectiveness and efficiency of Machine Translation (MT). The issue of ambiguity has limited the usefulness and accuracy of translation from Arabic to English, and the lack of Arabic resources makes the ambiguity problem more complicated. Additionally, the orthographic level of representation cannot specify the exact meaning of a word. This paper looks at the diacritics of the Arabic language and uses them to disambiguate a word. The proposed approach to word sense disambiguation uses the Diacritizer application to diacritize Arabic text and then finds the most accurate sense of an ambiguous word using a Naïve Bayes classifier. Our experimental study shows that using Arabic diacritics with a Naïve Bayes classifier enhances the accuracy of choosing the appropriate sense by 23% and also decreases ambiguity in machine translation.
Keywords: Arabic natural language processing, machine learning, machine translation, Naïve Bayes classifier, word sense disambiguation
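A minimal sketch of the classification step: a Naïve Bayes classifier trained on character n-gram features of the diacritized context to pick a sense label. The toy sense labels and context strings are invented for illustration, and the diacritization is assumed to have been done upstream by the Diacritizer application.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training data: diacritized contexts labelled with the intended sense of the
# ambiguous word (here the consonant string "ذهب": "went" as a verb, "gold" as a noun).
contexts = [
    "ذَهَبَ الوَلَدُ إِلى المَدرَسَةِ",   # verb sense: "went"
    "اشترى خاتَماً مِن ذَهَبٍ",          # noun sense: "gold"
]
senses = ["went", "gold"]

wsd = make_pipeline(CountVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
                    MultinomialNB())
wsd.fit(contexts, senses)

print(wsd.predict(["سِوارٌ مِن ذَهَبٍ خالِصٍ"]))  # expected: ['gold']
```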
Procedia PDF Downloads 359
7106 Implementation of ALD in Product Development: Study of ROPS to Improve Energy Absorption Performance Using Absorption Part
Authors: Zefry Darmawan, Shigeyuki Haruyama, Ken Kaminishi
Abstract:
Product development is a major issue in industrial competition and plays a serious part in the development of technology. The product development process must adapt to rapid changes in market needs and transform them into engineering concepts in order to produce high-quality products. One of the latest methods in product development is Analysis-Led Design (ALD). It utilizes digital engineering design tools with finite element analysis to perform robust product analysis and is valuable for product reliability assurance. Heavy machinery that operates under severe conditions should maintain the safety of the customer when faced with potential hazards. The cab frame should be able to absorb energy during a collision. Through ALD, a series of improvements to the cab frame to increase energy absorption was made and analyzed. Improvements were made by modifying the shape of the frame and/or installing an absorption device in certain areas. Simulation results showed that installing an absorption device could increase the absorbed energy more than modifying the shape.
Keywords: ALD, ROPS, energy absorption, cab frame
Procedia PDF Downloads 372
7105 Resistivity Tomography Optimization Based on Parallel Electrode Linear Back Projection Algorithm
Authors: Yiwei Huang, Chunyu Zhao, Jingjing Ding
Abstract:
Electrical Resistivity Tomography has been widely used in medicine and geology, for example for imaging lung impedance and analyzing soil impedance. Linear Back Projection is the core algorithm of Electrical Resistivity Tomography, but traditional Linear Back Projection cannot make full use of the information in the electric field. In this paper, a Parallel Electrode Linear Back Projection imaging method for Electrical Resistivity Tomography is proposed. By changing the connection mode of the electrodes, it generates an electric field distribution that is not linearly related to that of traditional Linear Back Projection, captures new information, and improves imaging accuracy without increasing the number of electrodes. The simulation results show that the accuracy of the image obtained by the inverse operation of the Parallel Electrode Linear Back Projection can be improved by about 20%.
Keywords: electrical resistivity tomography, finite element simulation, image optimization, parallel electrode linear back projection
Procedia PDF Downloads 154
7104 SC-LSH: An Efficient Indexing Method for Approximate Similarity Search in High Dimensional Space
Authors: Sanaa Chafik, Imane Daoudi, Mounim A. El Yacoubi, Hamid El Ouardi
Abstract:
Locality Sensitive Hashing (LSH) is one of the most promising techniques for solving the nearest neighbour search problem in high-dimensional space. Euclidean LSH is the most popular variation of LSH and has been successfully applied in many multimedia applications. However, Euclidean LSH presents limitations that affect structure and query performance. Its main limitation is large memory consumption: in order to achieve good accuracy, a large number of hash tables is required. In this paper, we propose a new hashing algorithm to overcome the storage space problem and improve query time, while keeping accuracy similar to that achieved by the original Euclidean LSH. The experimental results on a real large-scale dataset show that the proposed approach achieves good performance and consumes less memory than the Euclidean LSH.
Keywords: approximate nearest neighbor search, content based image retrieval (CBIR), curse of dimensionality, locality sensitive hashing, multidimensional indexing, scalability
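The Euclidean LSH family that SC-LSH builds on hashes a vector with random projections quantized into buckets of width w; the sketch below shows that standard hash function and one hash table, not the SC-LSH scheme itself.

```python
import numpy as np

class EuclideanLSH:
    """One LSH table: g(v) = (h_1(v), ..., h_k(v)) with h_i(v) = floor((a_i . v + b_i) / w)."""

    def __init__(self, dim, k=8, w=4.0, seed=0):
        rng = np.random.default_rng(seed)
        self.a = rng.normal(size=(k, dim))      # random Gaussian projection directions
        self.b = rng.uniform(0.0, w, size=k)    # random offsets in [0, w)
        self.w = w
        self.buckets = {}

    def key(self, v):
        return tuple(np.floor((self.a @ v + self.b) / self.w).astype(int))

    def index(self, vectors):
        for i, v in enumerate(vectors):
            self.buckets.setdefault(self.key(v), []).append(i)

    def query(self, q):
        """Candidate neighbours: points that collide with q in this table."""
        return self.buckets.get(self.key(q), [])

# Several independent tables are normally combined; accuracy grows with the number
# of tables at the cost of memory, which is the trade-off targeted by SC-LSH.
```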
Procedia PDF Downloads 322
7103 Impact of Marine Hydrodynamics and Coastal Morphology on Changes in Mangrove Forests (Case Study: West of Strait of Hormuz, Iran)
Authors: Fatemeh Parhizkar, Mojtaba Yamani, Abdolla Behboodi, Masoomeh Hashemi
Abstract:
Mangrove forests are natural and valuable gifts that exist in some parts of the world, including Iran. Regarding the threats faced by these forests and their declining area all over the world, as well as in Iran, it is very necessary to manage and monitor them. The current study aimed to investigate the changes in mangrove forests and the relationship between these changes and marine hydrodynamics and coastal morphology in the area between Qeshm Island and the west coast of the Hormozgan province (i.e., the coastline between the Mehran river and Bandar-e Pol port) over a 49-year period. After preprocessing and classifying satellite images using the SVM, MLC, and ANN classifiers and evaluating the accuracy of the maps, the SVM approach with the highest accuracy (a Kappa coefficient of 0.97 and an overall accuracy of 98%) was selected for preparing the classification map of all images. The results indicate that from 1972 to 1987 the area of these forests experienced a declining trend, and in the following years their expansion began. These forests include the mangrove forests of the Khurkhuran wetland, the Muriz Deraz Estuary, the Haft Baram Estuary, the mangrove forest south of Laft Port, and the mangrove forests between the Tabl Pier, Maleki Village, and Gevarzin Village. The marine hydrodynamic and geomorphological characteristics of the region, such as the average intertidal zone, sediment data, the freshwater inlet of the Mehran river, wave stability and calmness, topography and slope, as well as mangrove conservation projects, make further expansion of mangrove forests in this area possible. By providing significant and up-to-date information on the development and decline of mangrove forests in different parts of the coast, this study can significantly contribute to measures for the conservation and restoration of mangrove forests.
Keywords: mangrove forests, marine hydrodynamics, coastal morphology, west of strait of Hormuz, Iran
Procedia PDF Downloads 97
7102 Improving Concrete Properties with Fibers Addition
Authors: E. Mello, C. Ribellato, E. Mohamedelhassan
Abstract:
This study investigated the improvement in concrete properties with the addition of cellulose, steel, carbon, and PET fibers. Each fiber was added at four percentages to the fresh concrete, which was moist-cured for 28 days and then tested for compressive, flexural, and tensile strengths. Changes in strength and increases in cost were analyzed. Results showed that the addition of cellulose caused a decrease of between 9.8% and 16.4% in compressive strength. This range may be acceptable, as cellulose fibers can significantly increase the concrete's resistance to fire and to freezing and thawing cycles. The addition of steel fibers increased the compressive strength by up to 20%, and increases of 121.5% and 80.7% were reported in tensile and flexural strengths, respectively. Carbon fibers increased flexural and tensile strengths by up to 11% and 45%, respectively. Concrete strength properties decreased after the addition of PET fibers. The results showed that the improvement in strength after the addition of steel and carbon fibers may justify the extra cost of the fibers.
Keywords: concrete, compressive strength, fibers, flexural strength, tensile strength
Procedia PDF Downloads 443
7101 Modeling of Geotechnical Data Using GIS and Matlab for Eastern Ahmedabad City, Gujarat
Authors: Rahul Patel, S. P. Dave, M. V Shah
Abstract:
Ahmedabad is a rapidly growing city in western India that is experiencing significant urbanization and industrialization. With projections indicating that it will become a metropolitan city in the near future, various construction activities are taking place, making soil testing a crucial requirement before construction can commence. To achieve this, construction companies and contractors need to periodically conduct soil testing. This study focuses on the process of creating a spatial database that is digitally formatted and integrated with geotechnical data and a Geographic Information System (GIS). Building a comprehensive geotechnical Geo-database involves three essential steps. Firstly, borehole data is collected from reputable sources. Secondly, the accuracy and redundancy of the data are verified. Finally, the geotechnical information is standardized and organized for integration into the database. Once the Geo-database is complete, it is integrated with GIS. This integration allows users to visualize, analyze, and interpret geotechnical information spatially. Using a Topographic to Raster interpolation process in GIS, estimated values are assigned to all locations based on sampled geotechnical data values. The study area was contoured for SPT N-Values, Soil Classification, Φ-Values, and Bearing Capacity (T/m2). Various interpolation techniques were cross-validated to ensure information accuracy. The GIS map generated by this study enables the calculation of SPT N-Values, Φ-Values, and bearing capacities for different footing widths and various depths. This approach highlights the potential of GIS in providing an efficient solution to complex phenomena that would otherwise be tedious to achieve through other means. Not only does GIS offer greater accuracy, but it also generates valuable information that can be used as input for correlation analysis. Furthermore, this system serves as a decision support tool for geotechnical engineers. The information generated by this study can be utilized by engineers to make informed decisions during construction activities. For instance, they can use the data to optimize foundation designs and improve site selection. In conclusion, the rapid growth experienced by Ahmedabad requires extensive construction activities, necessitating soil testing. This study focused on the process of creating a comprehensive geotechnical database integrated with GIS. The database was developed by collecting borehole data from reputable sources, verifying its accuracy and redundancy, and organizing the information for integration. The GIS map generated by this study is an efficient solution that offers greater accuracy and generates valuable information that can be used as input for correlation analysis. It also serves as a decision support tool for geotechnical engineers, allowing them to make informed decisions during construction activities.
Keywords: arcGIS, borehole data, geographic information system (GIS), geo-database, interpolation, SPT N-value, soil classification, φ-value, bearing capacity
Procedia PDF Downloads 70
7100 Screening of Different Native Genotypes of Broadleaf Mustard against Different Diseases
Authors: Nisha Thapa, Ram Prasad Mainali, Prakriti Chand
Abstract:
Broadleaf mustard is a commercialized leafy vegetable of Nepal. However, its production and productivity are hindered by the high intensity of insect pests and diseases, which cause great losses. The plant protection aspects of the crop's disease and damage intensity have not been studied much from a research perspective in Nepal. This research aimed to evaluate broadleaf mustard genotypes for resistance against different diseases. A total of 35 native genotypes of broadleaf mustard were screened by scoring the plants at weekly intervals for ten weeks. Five different diseases, namely Rhizoctonia root rot, Alternaria blight, black rot, turnip mosaic virus disease, and white rust, were reported on the broadleaf mustard genotypes. Out of the 35 genotypes, 23 showed very high Rhizoctonia root rot severity, whereas 8 showed very high Alternaria blight severity. Likewise, 3 genotypes showed high black rot severity, and 1 genotype showed very high turnip mosaic virus disease incidence. Similarly, 2 genotypes were found to have very high white rust severity. Among the diseases of national importance, Rhizoctonia root rot was found to be the most severe, causing the greatest loss. Broadleaf mustard genotypes like Rato Rayo, CO 1002, and CO 11007 showed average to high levels of field resistance; therefore, these genotypes should be used, conserved, and stored in a mustard improvement program. The disease resistance or susceptibility of these genotypes can be helpful to seed-producing farmers, companies, and other stakeholders through varietal improvement and development work, which further aids the sustainable disease management of this vegetable.
Keywords: genotype, disease resistance, Rhizoctonia root rot severity, varietal improvement
Procedia PDF Downloads 81
7099 Energy Consumption Forecast Procedure for an Industrial Facility
Authors: Tatyana Aleksandrovna Barbasova, Lev Sergeevich Kazarinov, Olga Valerevna Kolesnikova, Aleksandra Aleksandrovna Filimonova
Abstract:
We consider forecasting of energy consumption by individual production areas of a large industrial facility as well as by the facility itself. For the production areas, the forecast is based on empirical dependencies between specific energy consumption and production output. For the facility itself, the task of minimizing the energy consumption forecasting error is addressed by adjusting the facility's actual energy consumption values, evaluated with the metering device, against the total design energy consumption of the separate production areas of the facility. The suggested procedure for optimal energy consumption forecasting was tested on actual data for core product output and energy consumption from a group of workshops and power plants of a large iron and steel facility. Test results show that implementation of this procedure gives a mean energy consumption forecasting accuracy for winter 2014 of 0.11% for the group of workshops and 0.137% for the power plants.
Keywords: energy consumption, energy consumption forecasting error, energy efficiency, forecasting accuracy, forecasting
Procedia PDF Downloads 446
7098 Modelling Phase Transformations in Zircaloy-4 Fuel Cladding under Transient Heating Rates
Authors: Jefri Draup, Antoine Ambard, Chi-Toan Nguyen
Abstract:
Zirconium alloys exhibit solid-state phase transformations under thermal loading. These can lead to a significant evolution of the microstructure and the associated mechanical properties of materials used in nuclear fuel cladding structures. Therefore, the ability to capture the effects of phase transformation on the material constitutive behavior is of interest under conditions of severe transient thermal loading. Whilst typical Avrami, or Johnson-Mehl-Avrami-Kolmogorov (JMAK), type models for phase transformations have been shown to correlate well with the behavior of Zircaloy-4 under constant heating rates, the effects of variable and fast heating rates are not fully explored. The present study utilises the results of in-situ high-energy synchrotron X-ray diffraction (SXRD) measurements in order to validate the phase transformation models for Zircaloy-4 under fast, variable heating rates. These models are used to assess the performance of fuel cladding structures under loss-of-coolant accident (LOCA) scenarios. The results indicate that simple Avrami-type models can provide a reasonable indication of the phase distribution in experimental test specimens under variable fast thermal loading. However, the accuracy of these models deteriorates under the faster heating regimes, i.e., 100 °C s⁻¹. The studies highlight areas for improvement of simple Avrami-type models, such as the inclusion of the temperature-rate dependence of the JMAK n-exponent.
Keywords: accident, fuel, modelling, zirconium
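The JMAK form referred to above gives the transformed phase fraction as X(t) = 1 − exp(−k tⁿ) under isothermal conditions. The sketch below evaluates it and, as an assumed extension for a varying temperature history, steps it forward with the common additivity (fictitious-time) rule and an Arrhenius rate constant; the abstract does not specify that the authors used this particular extension, and the parameter values are placeholders.

```python
import numpy as np

def jmak_fraction(t, k, n):
    """Isothermal JMAK transformed fraction X(t) = 1 - exp(-k * t**n)."""
    return 1.0 - np.exp(-k * np.asarray(t, float) ** n)

def jmak_nonisothermal(times, temps_K, k0, Q, n, R=8.314):
    """Assumed additivity-rule stepping for a varying temperature history:
    invert the isothermal law to get the fictitious time consistent with the
    current fraction, advance by dt, and convert back to a fraction."""
    X, out = 0.0, []
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        k = k0 * np.exp(-Q / (R * temps_K[i]))                     # Arrhenius rate
        t_fict = (-np.log(max(1.0 - X, 1e-12)) / k) ** (1.0 / n)   # fictitious time
        X = 1.0 - np.exp(-k * (t_fict + dt) ** n)
        out.append(X)
    return np.array(out)

print(jmak_fraction(5.0, k=0.01, n=2.5))          # isothermal example
t = np.linspace(0.0, 10.0, 201)
T = 1000.0 + 100.0 * t                            # ~100 C/s ramp, in kelvin
print(jmak_nonisothermal(t, T, k0=1e6, Q=2.0e5, n=2.5)[-1])
```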
Procedia PDF Downloads 142
7097 A Power Management System for Indoor Micro-Drones in GPS-Denied Environments
Authors: Yendo Hu, Xu-Yu Wu, Dylan Oh
Abstract:
GPS-denied drones open the possibility of indoor applications, including dynamic aerial surveillance, inspection, safety enforcement, and discovery. Indoor swarming further enhances these applications in accuracy, robustness, operational time, and coverage. For micro-drones, power management becomes a critical issue, given the battery payload restriction. This paper proposes an application-enabling battery replacement solution that extends the micro-drone active phase without human intervention. First, a framework to quantify the effectiveness of a power management solution for a drone fleet is proposed: the operation-to-non-operation ratio, ONR, gives a quantitative benchmark for measuring the effectiveness of a power management solution. Second, a survey was carried out to evaluate the ONR performance of the various existing solutions. Third, through analysis, this paper proposes a solution tailored to the indoor micro-drone and suitable for swarming applications. The proposed automated battery replacement solution, along with a modified micro-drone architecture, was implemented together with the associated micro-drone. Fourth, the system was tested and compared with the various solutions within the industry. Results show that the proposed solution achieves an ONR value of 31, a one-fold improvement over the best alternative option. The cost analysis shows a manufacturing cost of $25, which makes this approach viable for cost-sensitive markets (e.g., consumer). Further challenges remain in the areas of drone design for automated battery replacement, landing pad/drone production, high-precision landing control, and ONR improvement.
Keywords: micro-drone, battery swap, battery replacement, battery recharge, landing pad, power management
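The abstract does not spell out the exact ONR formula, so the sketch below assumes the natural reading of the name: cumulative operational (flight) time divided by cumulative non-operational time (battery replacement or recharge downtime) over a mission cycle. Both the definition and the numbers are illustrative assumptions, not figures from the paper.

```python
def onr(flight_minutes_per_cycle, downtime_minutes_per_cycle):
    """Operation-to-non-operation ratio, assuming ONR = operating time / downtime."""
    return flight_minutes_per_cycle / downtime_minutes_per_cycle

# Hypothetical comparison of a recharge-on-pad cycle vs. an automated swap cycle:
recharge_on_pad = onr(20.0, 40.0)        # land and recharge: long downtime
automated_swap = onr(20.0, 20.0 / 31.0)  # downtime sized to give ONR = 31
print(f"recharge ONR = {recharge_on_pad:.2f}, swap ONR = {automated_swap:.1f}")
```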
Procedia PDF Downloads 122
7096 Improved Wetting for Improved Solubility and Dissolution of Candesartan Cilexetil
Authors: Shilpa Bhilegaonkar, Ram Gaud
Abstract:
Candesartan cilexetil is a poorly soluble antihypertensive agent with solubility-limited bioavailability (15%). To initiate the process of solubilisation, it is necessary to displace the air at the surface and wet the drug surface with a solvent with which the drug is compatible. The present research adopts this principle to improve the solubility and dissolution of candesartan cilexetil. The solvents used here are a surfactant and a modified surfactant in different drug:solvent ratios (1:1 to 1:9) for the preparation of adsorbates. The adsorbates were then converted into free-flowing powders as liquisolid compacts and compressed to form tablets. Liquisolid compacts were evaluated for improvement in the saturation solubility and dissolution of candesartan cilexetil. All systems were evaluated for improvement in saturation solubility and dissolution in different media, such as water, 0.1 N HCl, phosphate buffer pH 6.8, and the media given by the Office of Generic Drugs, along with other physicochemical testing. All systems exhibited a promising advantage in terms of solubility and dissolution without affecting the drug structure, as confirmed by IR and XRD. No considerable advantage was seen in increasing the solvent-to-drug ratio.
Keywords: candesartan cilexetil, improved dissolution, solubility, liquisolid
Procedia PDF Downloads 328
7095 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models
Authors: Ainouna Bouziane
Abstract:
The ability of Electron Tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in the last decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and its avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (Compressed Sensing - Total Variation Minimization) algorithms to extract more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is also an important issue that has not yet been properly addressed, because a perfectly known reference is needed. The problem becomes particularly complicated in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction/segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters such as the range of tilt angles, image noise level, or object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.
Keywords: electron tomography, supported catalysts, nanometrology, error assessment
Procedia PDF Downloads 88