Search results for: laser processing

1517 Computer Aided Design Solution Based on Genetic Algorithms for FMEA and Control Plan in Automotive Industry

Authors: Nadia Belu, Laurenţiu Mihai Ionescu, Agnieszka Misztal

Abstract:

The automotive industry is one of the most important industries in the world, affecting not only the economy but also world culture. In the present financial and economic context, the field faces new challenges posed by the current crisis: companies must maintain product quality and deliver on time at a competitive price in order to achieve customer satisfaction. Two of the quality management techniques most strongly recommended by the automotive industry's standards for product development are Failure Mode and Effects Analysis (FMEA) and the Control Plan. FMEA is a risk-management and quality-improvement methodology aimed at identifying potential causes of product and process failure, quantifying them through risk assessment, ranking the identified problems by importance, and determining and implementing the related corrective actions. Companies use Control Plans, built from the FMEA results, to evaluate a process or product for strengths and weaknesses and to prevent problems before they occur. Control Plans are written descriptions of the systems used to control and minimize product and process variation. In addition, Control Plans specify the process monitoring and control methods (for example, Special Controls) used to control Special Characteristics. In this paper we propose a computer-aided solution based on Genetic Algorithms that reduces the effort of drafting the FMEA and Control Plan reports required for product launch and improves the knowledge of the development teams for future projects. The solution allows the design team to enter the data required for the FMEA. The actual analysis is performed using Genetic Algorithms to find the optimum trade-off between the RPN risk factor and the production cost. Genetic Algorithms are well suited to multi-criteria optimization problems; in our case, the three specific FMEA risk factors are considered together with the reduction of production cost. The analysis tool generates final reports for all FMEA processes. The data obtained in the FMEA reports are automatically integrated with the other entered parameters in the Control Plan. The solution is implemented as an application running on an intranet across two servers: one hosting the analysis and plan-generation engine and the other hosting the database where the initial parameters and results are stored. The results can then be used as starting solutions in the synthesis of other projects. The solution was applied to the welding, laser cutting, and bending processes used to manufacture bus chassis. The advantages of the solution are the efficient elaboration of documents in the current project, by automatically generating the FMEA and Control Plan reports using multi-criteria optimization of production, and the building of a solid knowledge base for future projects. The proposed solution, implemented with Open Source tools, is an inexpensive alternative to other solutions on the market.
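
As a hedged illustration of the multi-criteria genetic search described above, the following sketch scores corrective-action plans by combining the FMEA Risk Priority Number (RPN = severity × occurrence × detection) with production cost. The candidate actions, baseline scores, weights, and GA parameters are illustrative assumptions, not the authors' implementation.

```python
import random

# Illustrative candidate corrective actions: (occurrence reduction, detection reduction, cost)
CANDIDATE_ACTIONS = [(2, 0, 120.0), (1, 1, 80.0), (0, 3, 200.0), (1, 0, 40.0), (0, 1, 30.0)]
SEVERITY, BASE_OCCURRENCE, BASE_DETECTION = 8, 6, 7   # assumed baseline FMEA scores (1-10)
W_RPN, W_COST = 0.7, 0.3                              # assumed weights for the two criteria

def evaluate(plan):
    """Return a combined score (lower is better) for a 0/1 vector of selected actions."""
    occ = max(1, BASE_OCCURRENCE - sum(a[0] for a, s in zip(CANDIDATE_ACTIONS, plan) if s))
    det = max(1, BASE_DETECTION - sum(a[1] for a, s in zip(CANDIDATE_ACTIONS, plan) if s))
    cost = sum(a[2] for a, s in zip(CANDIDATE_ACTIONS, plan) if s)
    rpn = SEVERITY * occ * det                        # RPN = S x O x D
    return W_RPN * rpn / (SEVERITY * 10 * 10) + W_COST * cost / sum(a[2] for a in CANDIDATE_ACTIONS)

def genetic_search(pop_size=30, generations=50, p_mut=0.1):
    pop = [[random.randint(0, 1) for _ in CANDIDATE_ACTIONS] for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection of parents
        parents = [min(random.sample(pop, 3), key=evaluate) for _ in range(pop_size)]
        children = []
        for i in range(0, pop_size, 2):
            a, b = parents[i], parents[(i + 1) % pop_size]
            cut = random.randrange(1, len(CANDIDATE_ACTIONS))                 # one-point crossover
            child = [1 - g if random.random() < p_mut else g for g in a[:cut] + b[cut:]]  # mutation
            children.append(child)
            children.append(b[:cut] + a[cut:])
        pop = children[:pop_size]
    best = min(pop, key=evaluate)
    return best, evaluate(best)

if __name__ == "__main__":
    plan, score = genetic_search()
    print("selected actions:", plan, "score:", round(score, 3))
```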

Keywords: automotive industry, FMEA, control plan, automotive technology

Procedia PDF Downloads 403
1516 Non-Contact Measurement of Soil Deformation in a Cyclic Triaxial Test

Authors: Erica Elice Uy, Toshihiro Noda, Kentaro Nakai, Jonathan Dungca

Abstract:

Deformation in a conventional cyclic triaxial test is normally measured using point-wise measuring devices. In this study, a non-contact measurement technique was applied in order to monitor and measure the non-homogeneous behavior of the soil under cyclic loading. Non-contact measurement is executed through image processing. Two-dimensional measurements were performed using the Lucas-Kanade optical flow algorithm, implemented in LabVIEW. In this technique, the non-homogeneous deformation was monitored using a mirrorless camera. A mirrorless camera was used because it is economical and has the capacity to take pictures at a fast rate. The camera was first calibrated to remove the distortion introduced by the lens as well as by the testing environment. Calibration was divided into two phases. The first phase was the calibration of the camera parameters and the distortion caused by the lens. The second phase was for eliminating the distortion introduced by the plexiglass of the triaxial cell. A correction factor was established from this phase. A series of consolidated undrained cyclic triaxial tests was performed using a coarse soil. The results from the non-contact measurement technique were compared to the deformation measured by the linear variable displacement transducer. It was observed that deformation was higher in the area where failure occurs.
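
The study implemented the tracking in LabVIEW; as a hedged illustration of the same Lucas-Kanade idea, the Python/OpenCV sketch below tracks speckle points between two frames and reports their pixel displacement. The file names and tracking parameters are assumptions, not the authors' settings.

```python
import cv2
import numpy as np

# Load two consecutive frames of the specimen surface (file names are assumed)
frame0 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
frame1 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Pick well-textured points on the speckle pattern to track
p0 = cv2.goodFeaturesToTrack(frame0, maxCorners=500, qualityLevel=0.01, minDistance=7)

# Pyramidal Lucas-Kanade optical flow between the two frames
p1, status, err = cv2.calcOpticalFlowPyrLK(
    frame0, frame1, p0, None,
    winSize=(21, 21), maxLevel=3,
    criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01),
)

# Displacement vectors (in pixels) for the successfully tracked points
good_old = p0[status.flatten() == 1].reshape(-1, 2)
good_new = p1[status.flatten() == 1].reshape(-1, 2)
displacement = good_new - good_old
print("mean vertical displacement [px]:", displacement[:, 1].mean())
```

The pixel displacements would still have to be converted to physical deformation using the calibration and the plexiglass correction factor described in the abstract.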

Keywords: cyclic loading, non-contact measurement, non-homogeneous, optical flow

Procedia PDF Downloads 297
1515 Formulation and Optimization of Self Nanoemulsifying Drug Delivery System of Rutin for Enhancement of Oral Bioavailability Using QbD Approach

Authors: Shrestha Sharma, Jasjeet K. Sahni, Javed Ali, Sanjula Baboota

Abstract:

Introduction: Rutin is a naturally occurring strong antioxidant molecule belonging to the bioflavonoid category. Due to its free radical scavenging properties, it has been found to be beneficial in the treatment of various diseases, including inflammation, cancer, diabetes, allergy, cardiovascular disorders and various types of microbial infections. Despite its beneficial effects, it suffers from low aqueous solubility, which is responsible for its low oral bioavailability. The aim of our study was to optimize and characterize a self-nanoemulsifying drug delivery system (SNEDDS) of rutin using a Box-Behnken design (BBD) combined with a desirability function. Further, various antioxidant, pharmacokinetic and pharmacodynamic studies were performed for the optimized rutin SNEDDS formulation. Methodologies: Selection of the oil, surfactant and co-surfactant was done on the basis of solubility/miscibility studies. Sefsol + Vitamin E, Solutol HS 15 and Transcutol P were selected as the oil phase, surfactant and co-surfactant, respectively. Optimization of the SNEDDS formulations was done by a three-factor, three-level (3³) BBD. The independent factors were Sefsol + Vitamin E, Solutol HS15, and Transcutol P. The dependent variables were globule size, self-emulsification time (SEF), % transmittance and cumulative percentage of drug released. Various response surface graphs and contour plots were constructed to understand the effect of the different factors, their levels and their combinations on the responses. The optimized rutin SNEDDS formulation was characterized for various parameters such as globule size, zeta potential, viscosity, refractive index, % transmittance and in vitro drug release. Ex vivo permeation studies and pharmacokinetic studies were performed for the optimized formulation. Antioxidant activity was determined by DPPH and reducing power assays. Anti-inflammatory activity was determined using the carrageenan-induced rat paw oedema method. Permeation of rutin across the small intestine was assessed using confocal laser scanning microscopy (CLSM). Major findings: The optimized SNEDDS formulation consisting of Sefsol + Vitamin E, Solutol HS15 and Transcutol HP at proportions of 25:35:17.5 (w/w) was prepared, and the predicted and experimental values were found to be in close agreement. The globule size and PDI of the optimized SNEDDS formulation were found to be 16.08 ± 0.02 nm and 0.124 ± 0.01, respectively. A significant (p < 0.05) increase in percentage drug release was achieved in the case of the optimized SNEDDS formulation (98.8%) as compared to the rutin suspension. Furthermore, the pharmacokinetic study showed a 2.3-fold increase in relative oral bioavailability compared with that of the suspension. Antioxidant assay results indicated better efficacy of the developed formulation than the pure drug, and it was found to be comparable with ascorbic acid. The results of the anti-inflammatory studies showed 72.93% inhibition for the SNEDDS formulation, which was significantly higher than that of the drug suspension (46.56%). The results of CLSM indicated that the absorption of the SNEDDS formulation was considerably higher than that from the rutin suspension. Conclusion: Rutin SNEDDS have been successfully prepared, and they can serve as an effective tool in enhancing the oral bioavailability and efficacy of rutin.
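
As a hedged illustration of the experimental design mentioned above, the sketch below builds the coded run matrix of a three-factor, three-level Box-Behnken design in Python and maps it to real factor amounts. The factor ranges are illustrative values chosen around the reported optimum, not the authors' actual design space.

```python
from itertools import combinations, product

def box_behnken_3factor(center_runs=3):
    """Coded (-1, 0, +1) run matrix of a three-factor Box-Behnken design."""
    runs = []
    for i, j in combinations(range(3), 2):        # vary factors pairwise at +/-1 ...
        for a, b in product((-1, 1), repeat=2):
            row = [0, 0, 0]                       # ... while the third factor stays at 0
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0, 0, 0]] * center_runs             # replicated centre points
    return runs

# Map coded levels to illustrative amounts (% w/w) of oil, surfactant, co-surfactant
levels = {
    "Sefsol + vitamin E": (15, 25, 35),
    "Solutol HS15": (25, 35, 45),
    "Transcutol P": (10, 17.5, 25),
}
names = list(levels)
for run in box_behnken_3factor():
    print({n: levels[n][c + 1] for n, c in zip(names, run)})
```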

Keywords: rutin, oral bioavailability, pharmacokinetics, pharmacodynamics

Procedia PDF Downloads 494
1514 The Evolution of the Human Brain from the Hind Brain to the Fore Brain: Dialectics from the African Perspective in Understanding Stunted Development in Science and Technology

Authors: Philemon Wokoma Iyagba, Obey Onenee Christie

Abstract:

From the hindbrain, which is responsible for motor activities, to the forebrain, responsible for processing information related to complex cognitive activities, the human brain has continued to evolve over the years. This evolution has been progressive, leading to advancements in science and technology. However, the development of science and technology in Africa, where ancient civilization arguably began, has been retrogressive. Dialectics was carried out by dissecting different opinions on the reasons behind the stunted development of science and technology in Africa. The researchers propose that the inability to sustain the technological advancements made by early Africans is due to the poor replicability, or lack thereof, of the African knowledge-based system, little or poor documentation of the adopted procedures, and an approval-seeking mentality that cheaply paved the way for westernization. This westernization, in turn, led to the adulteration of the African way of life and education without making room for incorporating Africa's identity, properly aligning its rich cultural heritage within education, or acknowledging its enormous achievements before and during the Middle Ages. This article discusses conceptual issues, with its positions based on established facts; the discussion is grounded in relevant literature, and recommendations are made accordingly.

Keywords: forebrain, hindbrain, dialectics from African perspective, development in science and technology

Procedia PDF Downloads 74
1513 Non-Targeted Adversarial Image Classification Attack-Region Modification Methods

Authors: Bandar Alahmadi, Lethia Jackson

Abstract:

Machine learning models are used today in many real-life applications. The safety and security of such models are important so that the results of the model remain as accurate as possible. One challenge of machine learning model security is the adversarial examples attack. Adversarial examples are designed by the attacker to cause the machine learning model to misclassify the input. We propose a method to generate adversarial examples to attack image classifiers. We modify successfully classified images so that a classifier misclassifies them after the modification. In our method, we do not update the whole image; instead, we detect the important region, modify it, place it back into the original image, and then run it through a classifier. The algorithm modifies the detected region using two methods. First, it adds an abstract image matrix behind the detected image matrix. Then, it performs a rotation attack that rotates the detected region around its axes and embeds the trace of the image in the image background. Finally, the attacked region is placed back in the original position from which it was removed, and a smoothing filter is applied to blend the background with the foreground. We test our method on a cascade classifier, and the algorithm is efficient: the classifier confidence drops to almost zero. We also test it on a CNN (convolutional neural network) with higher settings, and the algorithm works successfully.
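
A minimal sketch of the region-modification step described above is given below in Python/OpenCV: it rotates an already-detected region, pastes it back where it was removed from, and smooths the seam. The bounding box, rotation angle, and file names are illustrative assumptions; the region-detection and abstract-matrix steps of the paper are not reproduced here.

```python
import cv2
import numpy as np

def rotate_and_reinsert(image, box, angle=15.0, blur_ksize=5):
    """Rotate the detected region in place and smooth the seam with the background.

    `box` is an assumed (x, y, w, h) bounding box of the important region; the
    region-detection step itself (e.g. a cascade detector) is not shown here.
    """
    x, y, w, h = box
    region = image[y:y + h, x:x + w].copy()

    # Rotate the cropped region about its own centre
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, 1.0)
    rotated = cv2.warpAffine(region, M, (w, h), borderMode=cv2.BORDER_REFLECT)

    # Place the modified region back where it was removed from
    attacked = image.copy()
    attacked[y:y + h, x:x + w] = rotated

    # Smooth a thin band around the pasted region so foreground blends with background
    pad = 4
    y0, y1 = max(0, y - pad), min(image.shape[0], y + h + pad)
    x0, x1 = max(0, x - pad), min(image.shape[1], x + w + pad)
    attacked[y0:y1, x0:x1] = cv2.GaussianBlur(attacked[y0:y1, x0:x1], (blur_ksize, blur_ksize), 0)
    return attacked

# Usage (file name and box are illustrative)
img = cv2.imread("input.jpg")
adv = rotate_and_reinsert(img, box=(120, 80, 64, 64), angle=20.0)
cv2.imwrite("adversarial.jpg", adv)
```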

Keywords: adversarial examples, attack, computer vision, image processing

Procedia PDF Downloads 333
1512 The Global-Local Dimension in Cognitive Control after Left Lateral Prefrontal Cortex Damage: Evidence from the Non-Verbal Domain

Authors: Eleni Peristeri, Georgia Fotiadou, Ianthi-Maria Tsimpli

Abstract:

The local-global dimension has been studied extensively in healthy controls, and a preference for globally processed stimuli has been validated in both the visual and auditory modalities. Critically, the local-global dimension has an inherent interference-resolution component, a type of cognitive control, and individuals with left prefrontal cortex (LPFC) damage have exhibited an inability to override habitual response behaviors in item recognition tasks that involve representational interference. Eight patients with damage in the left PFC (age range: 32;5 to 69;0; mean age: 54;6 years) and twenty age- and education-matched language-unimpaired adults (mean age: 56;7 years) participated in the study. Distinct performance patterns were found between the language-unimpaired and the LPFC-damaged groups, which mainly stemmed from the latter's difficulty with inhibiting global stimuli in incongruent trials. Overall, the local-global attentional dimension affects LPFC-damaged individuals with non-fluent aphasia in non-language domains, implicating distinct types of inhibitory processes depending on the level of processing.

Keywords: left lateral prefrontal cortex damage (LPFC), local-global non-language attention, representational interference, non-fluent aphasia

Procedia PDF Downloads 462
1511 Denoising of Motor Unit Action Potential Based on Tunable Band-Pass Filter

Authors: Khalida S. Rijab, Mohammed E. Safi, Ayad A. Ibrahim

Abstract:

When electrodes are mounted on the skin surface over a muscle, a signal is detected when the skeletal muscle undergoes contraction; the signal is known as the surface electromyographic (EMG) signal. This signal has a noise-like interference pattern resulting from the temporal and spatial summation of the action potentials (AP) of all active motor units (MU) near the detection electrode. By appropriate processing (decomposition), the surface EMG signal may be used to give an estimate of the motor unit action potential (MUAP). In this work, a denoising technique is applied to the MUAP signals extracted from the spatial filter (IB2). A set of signals from a non-invasive two-dimensional grid of 16 electrodes was recorded for different subjects, muscles, and sexes. These signals acquire noise during recording and detection. A digital fourth-order band-pass Butterworth filter is used for denoising; a tunable pass band with a suitable choice of cutoff frequencies is investigated, with the aim of obtaining a suitable band-pass frequency. Results show that an improvement of 1-3 dB in the signal-to-noise ratio (SNR) has been achieved, relative to the raw spatial filter output signals, for all cases under investigation. Furthermore, the research also included estimation and reconstruction of the mean shape of the MUAP.
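
As a hedged illustration of the filtering step, the sketch below applies a zero-phase fourth-order Butterworth band-pass filter with SciPy and reports a rough SNR proxy. The cutoff frequencies, sampling rate, and synthetic test signal are assumptions, not the values tuned in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_denoise(muap, fs, low_hz=20.0, high_hz=450.0, order=4):
    """Zero-phase 4th-order Butterworth band-pass filtering of a MUAP/EMG trace.

    The cutoff frequencies are illustrative values to be tuned, not the ones
    selected in the paper; fs is the sampling frequency in Hz.
    """
    b, a = butter(order, [low_hz, high_hz], btype="bandpass", fs=fs)
    return filtfilt(b, a, muap)

def snr_db(clean_estimate, raw):
    """Rough SNR proxy: filtered-signal power vs. power of the removed residual."""
    noise = raw - clean_estimate
    return 10.0 * np.log10(np.sum(clean_estimate ** 2) / np.sum(noise ** 2))

# Usage with a synthetic noisy trace (purely illustrative)
fs = 2048.0
t = np.arange(0, 0.5, 1.0 / fs)
raw = np.sin(2 * np.pi * 80 * t) + 0.3 * np.random.randn(t.size)
filtered = bandpass_denoise(raw, fs)
print("SNR proxy [dB]:", round(snr_db(filtered, raw), 2))
```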

Keywords: EMG, motor unit, digital filter, denoising

Procedia PDF Downloads 397
1510 Inflammatory Alleviation on Microglia Cells by an Apoptotic Mimicry

Authors: Yi-Feng Kao, Huey-Jine Chai, Chin-I Chang, Yi-Chen Chen, June-Ru Chen

Abstract:

Microglia are macrophages that reside in the brain, and overactive microglia may result in brain neuron damage or inflammation. In this study, phospholipids were extracted from squid skin and manufactured into liposomes (SQ liposomes) to mimic apoptotic bodies. We then evaluated the anti-inflammatory effects of SQ liposomes on the mouse microglial cell line BV-2 under lipopolysaccharide (LPS) induction. First, the major phospholipid constituents in the squid skin extract were 46.2% phosphatidylcholine, 18.4% phosphatidylethanolamine, 7.7% phosphatidylserine, 3.5% phosphatidylinositol, 4.9% lysophosphatidylcholine, and 19.3% other phospholipids, as determined by HPLC-UV analysis. The contents of eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) in the squid skin extract were 11.8% and 28.7%, respectively. Microscopic images showed that microglial cells can engulf apoptotic cells or SQ liposomes. In cell-based studies, there was no cytotoxicity to BV-2 when the SQ-liposome concentration was less than 2.5 mg/mL. The LPS-induced pro-inflammatory cytokines, including tumor necrosis factor-alpha (TNF-α) and interleukin-6 (IL-6), were significantly suppressed (P < 0.05) by pretreatment with 0.03-2.5 mg/mL SQ liposome. Conversely, secretion of the anti-inflammatory cytokines transforming growth factor-beta (TGF-β) and interleukin-10 (IL-10) was enhanced (P < 0.05). The results suggest that SQ liposomes possess anti-inflammatory properties in BV-2 cells and may be a good strategy against neuroinflammatory disease.

Keywords: apoptotic mimicry, neuroinflammation, microglia, squid processing by-products

Procedia PDF Downloads 473
1509 Anthropometric Data Variation within Gari-Frying Population

Authors: T. M. Samuel, O. O. Aremu, I. O. Ismaila, L. I. Onu, B. O. Adetifa, S. E. Adegbite, O. O. Olokoshe

Abstract:

The imperative of anthropometry in designing to fit cannot be overemphasized. Of essence is the variability of measurements among the population for which data are collected. In this paper, anthropometric data were collected for the design of a gari-frying facility, such that the work system would be designed to fit the gari-frying population in the Southwestern states of Nigeria, comprising Lagos, Ogun, Oyo, Osun, Ondo, and Ekiti. Twenty-seven body dimensions were measured among 120 gari-frying processors. Statistical analysis was performed using the SPSS package to determine the mean, standard deviation, minimum value, maximum value and percentiles (2nd, 5th, 25th, 50th, 75th, 95th, and 98th) of the different anthropometric parameters. A one-sample t-test was conducted to determine the variation within the population. The 50th percentiles of some of the anthropometric parameters were compared with those from other populations in the literature. The correlation between the workers' age and the body anthropometry was also investigated. The mean weight, height, shoulder height (sitting), eye height (standing) and eye height (sitting) are 63.37 kg, 1.57 m, 0.55 m, 1.45 m, and 0.67 m, respectively. Results also show a high correlation with other populations and a statistically significant difference in the variability of the data within the population in all the body dimensions measured. With a mean age of 42.36 years, the results show that age would be a wrong indicator for estimating the anthropometry of this population.
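
The study performed its statistics in SPSS; as a hedged illustration of the same computations, the Python sketch below derives the descriptive statistics, the reported percentiles, and a one-sample t-test for one body dimension. The stature values and the reference mean are made-up assumptions.

```python
import numpy as np
from scipy import stats

# Illustrative stature data (m) for a sample of gari-frying processors; values are made up
stature = np.random.default_rng(1).normal(loc=1.57, scale=0.06, size=120)

# Descriptive statistics and the percentiles reported in the study
summary = {
    "mean": stature.mean(),
    "std": stature.std(ddof=1),
    "min": stature.min(),
    "max": stature.max(),
}
percentiles = {p: np.percentile(stature, p) for p in (2, 5, 25, 50, 75, 95, 98)}

# One-sample t-test of the sample mean against a reference value (reference is an assumption)
t_stat, p_value = stats.ttest_1samp(stature, popmean=1.60)

print(summary)
print(percentiles)
print("t = %.2f, p = %.4f" % (t_stat, p_value))
```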

Keywords: anthropometry, cassava processing, design to fit, gari-frying, workstation design

Procedia PDF Downloads 249
1508 Marble Powder’s Effect on Permeability and Mechanical Properties of Concrete

Authors: Shams Ul Khaliq, Khan Shahzada, Bashir Alam, Fawad Bilal, Mushtaq Zeb, Faizan Akbar

Abstract:

The marble industry contributes its fair share to environmental deterioration, producing voluminous amounts of mud and other excess residues from marble and granite processing that pollute soil, water and air. Reusing these residues in other products will not only protect our environment from pollution but also help the economy. In this research, an attempt has been made to study the expediency of waste marble powder (MP) in concrete production. Various laboratory tests were performed to investigate permeability and physical and mechanical properties, such as slump, compressive strength, split tensile strength, etc. Concrete test samples were fabricated with varying MP content (replacing 5-30% of the cement), furnished from two different sources. A 5% replacement with marble dust caused 6% and 12% decreases in compressive and tensile strength, respectively. These parameters gradually decreased with increasing MP content up to 30%. The optimum results were obtained with 10% replacement. Improvements in consistency and permeability were noticed. The permeability improved with increasing MP proportion up to 10% without a substantial decrease in compressive strength. The obtained results revealed that MP as an alternative to cement in concrete production is a viable option, considering its economic and environmentally friendly implications.

Keywords: marble powder, strength, permeability, consistency, environment

Procedia PDF Downloads 324
1507 Mobile Devices and E-Learning Systems as a Cost-Effective Alternative for Digitizing Paper Quizzes and Questionnaires in Social Work

Authors: K. Myška, L. Pilařová

Abstract:

The article deals with the possibilities of using cheap mobile devices in combination with free or open-source software tools as an alternative to professional hardware and software equipment. Especially in social work, it is important to find a cheap yet functional solution that can compete with complex but expensive solutions for digitizing paper materials. Our research focused on the analysis of cheap and affordable solutions for digitizing the paper materials most frequently used by field workers in social work. We used comparative analysis as the research method. Social workers need to process data from paper forms quite often, and in many cases it is still more affordable and more time- and cost-effective to use paper forms to get feedback. Collecting data from paper quizzes and questionnaires can be done with the help of professional scanners and software. These technologies are very powerful and have advanced options for digitizing and processing the digitized data, but they are also very expensive. According to the results of our study, the combination of open-source software and a mobile phone or a cheap scanner can be considered a cost-effective alternative to professional equipment.

Keywords: digitalization, e-learning, mobile devices, questionnaire

Procedia PDF Downloads 148
1506 Association of Sensory Processing and Cognitive Deficits in Children with Autism Spectrum Disorders – Pioneer Study in Saudi Arabia

Authors: Rana Zeina

Abstract:

Objective: The association between sensory problems and cognitive abilities has been studied in individuals with Autism Spectrum Disorders (ASDs). In this study, we used a neuropsychological test to evaluate memory and attention in children with ASDs and sensory problems compared to children with ASDs without sensory problems. Methods: Four visual memory tests of the Cambridge Neuropsychological Test Automated Battery (CANTAB), including Big/Little Circle (BLC), Simple Reaction Time (SRT), Intra/Extra Dimensional Set Shift (IED), and Spatial Recognition Memory (SRM), were administered to 14 children with ASDs and sensory problems and 13 children with ASDs without sensory problems, aged 3 to 12, with IQs above 70. Results: Individuals with ASDs and sensory problems performed worse than the ASD group without sensory problems on comprehension, learning, reversal and simple reaction time tasks, and no significant difference between the two groups was recorded in terms of the visual memory and visual comprehension tasks. Conclusion: The findings of this study suggest that children with ASDs and sensory problems face deficits in learning, comprehension, reversal, and speed of response to stimuli.

Keywords: visual memory, attention, autism spectrum disorders, CANTAB eclipse

Procedia PDF Downloads 446
1505 Cardiokey: A Binary and Multi-Class Machine Learning Approach to Identify Individuals Using Electrocardiographic Signals on Wearable Devices

Authors: S. Chami, J. Chauvin, T. Demarest, Stan Ng, M. Straus, W. Jahner

Abstract:

Biometric tools such as fingerprint and iris recognition are widely used in industry to protect critical assets. However, their vulnerability and lack of robustness raise several concerns about the protection of highly critical assets. Biometrics based on electrocardiographic (ECG) signals is a robust identification tool. However, most state-of-the-art techniques have worked on clinical signals, which are of high quality and less noisy than those extracted from wearable devices such as a smartwatch. In this paper, we present a complete machine learning pipeline that identifies people using ECG extracted from an off-person device. An off-person device is a wearable device that is not used in a medical context, such as a smartwatch. In addition, one of the main challenges of ECG biometrics is the variability of the ECG across different persons and different situations. To address this issue, we propose two different approaches: a per-person classifier and a one-for-all classifier. The first approach builds a binary classifier to distinguish one person from all others. The second approach builds a multi-class classifier that distinguishes the selected set of individuals from non-selected individuals (others). In the preliminary results, the binary classifier obtained a performance of 90% in terms of accuracy on balanced data. The second approach reported a log loss of 0.05 as its multi-class score.
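
As a hedged illustration of the two approaches, the sketch below trains a per-person (binary) classifier and a one-for-all (multi-class) classifier on synthetic feature vectors and reports accuracy and log loss. The choice of random forests, the feature dimensionality, and the data are assumptions; the paper's actual feature extraction and models are not specified here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, log_loss

# X: heartbeat feature vectors extracted from the wearable ECG, y: subject identifiers.
# Synthetic stand-ins are used here; the real feature extraction is not shown.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 40))
y = rng.integers(0, 6, size=600)          # six enrolled subjects

# Per-person (binary) approach: subject 0 vs. everyone else
y_binary = (y == 0).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X, y_binary, test_size=0.3, stratify=y_binary, random_state=0)
binary_clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print("binary accuracy:", accuracy_score(yte, binary_clf.predict(Xte)))

# One-for-all (multi-class) approach: distinguish all enrolled subjects at once
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
multi_clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print("multi-class log loss:", log_loss(yte, multi_clf.predict_proba(Xte)))
```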

Keywords: biometrics, electrocardiographic, machine learning, signal processing

Procedia PDF Downloads 137
1504 Indoor Real-Time Positioning and Mapping Based on Manhattan Hypothesis Optimization

Authors: Linhang Zhu, Hongyu Zhu, Jiahe Liu

Abstract:

This paper investigates a method for indoor real-time positioning and mapping based on the Manhattan world assumption. In indoor environments, relying solely on feature matching techniques or other geometric algorithms for sensor pose estimation inevitably results in cumulative errors, posing a significant challenge to indoor positioning. To address this issue, we adopt the Manhattan world hypothesis to optimize the feature-matching-based camera pose algorithm, which improves the accuracy of camera pose estimation. A special processing method is applied to image data frames that conform to the Manhattan world assumption. When similar data frames appear subsequently, they can be used to eliminate drift in sensor pose estimation, thereby reducing cumulative errors in the estimation and optimizing mapping and positioning. Through experimental verification, it is found that our method achieves high-precision real-time positioning in indoor environments and successfully generates maps of indoor environments. This provides effective technical support for applications such as indoor navigation and robot control.

Keywords: Manhattan world hypothesis, real-time positioning and mapping, feature matching, loopback detection

Procedia PDF Downloads 56
1503 Finite Element Analysis of Human Tarsals, Metatarsals and Phalanges for Predicting Probable Locations of Fractures

Authors: Irfan Anjum Manarvi, Fawzi Aljassir

Abstract:

Human bones have long been a keen area of research in the field of biomechanical engineering. Medical professionals, as well as engineering academics and researchers, have investigated various bones using medical, mechanical, and materials approaches to build up the available body of knowledge. Their major focus has been to establish the properties of these bones and ultimately to develop processes and tools either to prevent fracture or to recover from its damage. The literature shows that mechanical professionals conducted a variety of tests for hardness, deformation, and strain field measurement to arrive at their findings. However, they considered the accuracy of these results to be insufficient due to various limitations of tools and test equipment and difficulties in the availability of human bones. They proposed the need for further studies to first overcome inaccuracies in measurement methods, testing machines, and experimental errors and then carry out experimental or theoretical studies. Finite element analysis is a technique which was developed for the aerospace industry due to the complexity of designs and materials. Over a period of time, it has found applications in many other industries due to its accuracy and its flexibility in the selection of materials and of the types of loading that can be theoretically applied to an object under study. In the past few decades, the field of biomechanical engineering has also started to see its applicability. However, the work done in the area of tarsals, metatarsals and phalanges using this technique is very limited. Therefore, the present research has focused on using this technique for the analysis of these critical bones of the human body. This technique requires a three-dimensional geometric computer model of the object to be analyzed. In the present research, a 3D laser scanner was used for accurate geometric scans of individual tarsals, metatarsals, and phalanges from a typical human foot to create these computer geometric models. These were then imported into finite element analysis software, and a length-refining process was carried out prior to analysis to ensure the computer models were true representatives of the actual bone. This was followed by analysis of each bone individually. A number of constraints and load conditions were applied to observe the stress and strain distributions in these bones under compressive and tensile loads or their combination. Results were collected for deformations along various axes, and stress and strain distributions were observed to identify critical locations where fracture could occur. A comparative analysis of the failure properties of all three types of bones was carried out to establish which of these could fail earlier, and this is presented in this research. The results of this investigation could be used for further experimental studies by academics and researchers, as well as industrial engineers, for the development of various foot protection devices or tools for surgical operations and recovery treatment of these bones. Researchers could build upon these models to carry out finite element analysis of a complete human foot under various loading conditions such as walking, marching, running, and landing after a jump.

Keywords: tarsals, metatarsals, phalanges, 3D scanning, finite element analysis

Procedia PDF Downloads 326
1502 A Hybrid Expert System for Generating Stock Trading Signals

Authors: Hosein Hamisheh Bahar, Mohammad Hossein Fazel Zarandi, Akbar Esfahanipour

Abstract:

In this paper, a hybrid expert system is developed using fuzzy genetic network programming with reinforcement learning (GNP-RL). In this system, the frame-based structure uses the trading rules extracted by GNP. These rules are extracted using technical indices of the stock prices over the training period. In developing this system, we applied fuzzy node transition and decision making in both the processing and judgment nodes of GNP-RL. Consequently, using these methods not only increased the accuracy of node transition and decision making in the GNP's nodes, but also extended the GNP's binary signals to ternary trading signals. In other words, in our proposed fuzzy GNP-RL model, a No Trade signal is added to the conventional Buy and Sell signals. Finally, the obtained rules are used in a frame-based system implemented in the Kappa-PC software. The developed trading system has been used to generate trading signals for ten companies listed on the Tehran Stock Exchange (TSE). The simulation results over the testing period show that the developed system performs more favorably than the Buy and Hold strategy.

Keywords: fuzzy genetic network programming, hybrid expert system, technical trading signal, Tehran stock exchange

Procedia PDF Downloads 328
1501 Self-Attention Mechanism for Target Hiding Based on Satellite Images

Authors: Hao Yuan, Yongjian Shen, Xiangjun He, Yuheng Li, Zhouzhou Zhang, Pengyu Zhang, Minkang Cai

Abstract:

Remote sensing data can provide support for decision-making in disaster assessment or disaster relief. Traditional methods of processing sensitive targets in remote sensing mapping are mainly based on manual retrieval and image editing tools, which are inefficient. Methods based on deep learning for sensitive target hiding are faster and more flexible, but they have disadvantages in training time and computational cost. This paper proposes a target hiding model, Self-Attention (SA) Deepfill, which uses self-attention modules to replace part of the gated convolution layers in image inpainting. Through this change, the computational load of the model becomes smaller and its performance is improved. In addition, this paper adds free-form masks to the model's training to enhance the model's generality. An experiment on an open remote sensing dataset proved the efficiency of our method. Moreover, experimental comparison shows that the proposed method can be trained for a longer time without over-fitting. Finally, compared with existing methods, the proposed model has a lower computational weight and better performance.
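
As a hedged illustration of the kind of module involved, the PyTorch sketch below implements a minimal SAGAN-style self-attention block of the sort often substituted for convolution layers in inpainting networks; it is not the authors' exact SA Deepfill architecture, and the channel sizes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention2d(nn.Module):
    """Minimal SAGAN-style self-attention block for feature maps of shape (B, C, H, W)."""

    def __init__(self, channels, reduction=8):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))   # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)          # (B, HW, C/r)
        k = self.key(x).flatten(2)                            # (B, C/r, HW)
        v = self.value(x).flatten(2)                          # (B, C, HW)
        attn = F.softmax(torch.bmm(q, k), dim=-1)             # (B, HW, HW) spatial attention
        out = torch.bmm(v, attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                           # residual connection

# Usage on a dummy feature map
features = torch.randn(2, 64, 32, 32)
print(SelfAttention2d(64)(features).shape)        # torch.Size([2, 64, 32, 32])
```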

Keywords: remote sensing mapping, image inpainting, self-attention mechanism, target hiding

Procedia PDF Downloads 124
1500 Experimental Characterization of Composite Material with Non-Contacting Methods

Authors: Nikolaos Papadakis, Constantinos Condaxakis, Konstantinos Savvakis

Abstract:

The aim of this paper is to determine the elastic properties (elastic modulus and Poisson's ratio) of a composite material based on non-contacting imaging methods. More specifically, the significantly reduced cost of digital cameras has opened the opportunity for highly reliable, low-cost strain measurement. The open-source platform Ncorr, which utilizes the method of digital image correlation (DIC), is used in this paper. Measuring strain with digital image correlation involves random speckle preparation on the surface of the gauge area, image acquisition, and post-processing of the image correlation to obtain the displacement and strain fields on the surface under study. Technical issues relating to the quality of the results to be obtained are discussed. [0]₈ fabric glass/epoxy composite specimens were prepared and tested at different orientations (0°, 30°, 45°, 60°, 90°). Each test was recorded with the camera at a constant frame rate and under constant lighting conditions. The recorded images were processed through the use of the image processing software. The parameters of the tests are reported. The strain map output obtained through strain measurement with Ncorr is validated by (a) comparing the elastic properties with the values expected from classical laminate theory and (b) finite element analysis.
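
As a hedged illustration of how the elastic constants follow from the DIC output, the sketch below fits the elastic modulus and Poisson's ratio from averaged axial and transverse strains at several load steps. All numerical values are made up, not the paper's measurements.

```python
import numpy as np

# Illustrative DIC output for a 0-degree specimen: average axial/transverse strains
# recorded at several load steps (values are made up, not the paper's measurements).
stress_mpa = np.array([0.0, 40.0, 80.0, 120.0, 160.0])          # applied axial stress
eps_axial = np.array([0.0, 0.0018, 0.0036, 0.0055, 0.0073])     # mean axial strain from the strain map
eps_trans = np.array([0.0, -0.0003, -0.0006, -0.0009, -0.0012]) # mean transverse strain

# Elastic modulus: slope of the stress-strain line (least-squares fit)
E_gpa = np.polyfit(eps_axial, stress_mpa, 1)[0] / 1000.0

# Poisson's ratio: negative slope of transverse vs. axial strain
nu = -np.polyfit(eps_axial, eps_trans, 1)[0]

print(f"E = {E_gpa:.1f} GPa, nu = {nu:.3f}")
```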

Keywords: composites, Ncorr, strain map, videoextensometry

Procedia PDF Downloads 139
1499 Extraction of Essential Oil and Pectin from Lime and Waste Technology Development

Authors: Wilaisri Limphapayom

Abstract:

Lime is one of the economically important crops produced in Thailand. The objective of this research is to increase its utilization in food and cosmetics. The extraction of essential oil and pectin from lime (Citrus aurantifolia (Christm & Panz) Swing) has been studied. Extraction of the essential oil was carried out by hydro-distillation; the essential oil yield ranged from 1.72 to 2.20%. The chemical composition of the essential oil comprised alpha-pinene, beta-pinene, D-limonene, camphene, alpha-phellandrene, gamma-terpinene, alpha-ocimene, o-cymene, 2-carene, linalool, trans-ocimenol, geraniol, citral, isogeraniol, verbinol, and others, when analyzed by the GC-MS method. Pectin was extracted from the lime waste boiled in water after the essential oil extraction. The pectin yield was found to be 40.11-65.81 g/100 g of lime peel. The best extraction condition, giving the higher yield, was obtained using ethanol extraction. This study gave satisfactory results with the potential to improve the lime processing system for value-added products. The present study also focused on lime powder production as a source of vitamin C (ascorbic acid) and on the potential of lime waste as a source of essential oil and pectin. Lime powder was produced with a spray dryer: lime juice with two different levels of maltodextrin DE 10 (30 and 50% w/w) was sprayed at an inlet air temperature of 150 degrees Celsius and an outlet temperature of 90 degrees Celsius. Lime powder with 50% maltodextrin gave the most desirable product quality. This product has a vitamin C content of 25 mg/100 g (w/w).

Keywords: extraction, pectin, essential oil, lime

Procedia PDF Downloads 292
1498 Network Word Discovery Framework Based on Sentence Semantic Vector Similarity

Authors: Ganfeng Yu, Yuefeng Ma, Shanliang Yang

Abstract:

Word discovery is a key problem in text information retrieval technology. New word discovery methods tend to be closely tied to words because they generally obtain new word results by analyzing words. With the popularity of social networks, individual netizens and online self-media have generated various network texts for the convenience of online life, including network words that are far from standard Chinese expression. How to detect network words is one of the important goals in the field of text information retrieval today. In this paper, we integrate a word embedding model and clustering methods to propose a network word discovery framework based on sentence semantic similarity (S³-NWD) that detects network words effectively from the corpus. This framework constructs sentence semantic vectors through a distributed representation model, uses the similarity of the sentence semantic vectors to determine the semantic relationship between sentences, and finally realizes network word discovery through the semantic replacement relationship between sentences. The experiments verify that the framework not only achieves rapid discovery of network words but also recovers the standard-word meanings of the discovered network words, which reflects the effectiveness of our work.
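
As a hedged illustration of the similarity step, the sketch below mean-pools word vectors into sentence vectors, compares sentences by cosine similarity, and treats tokens unique to near-paraphrase sentence pairs as candidate network words. The embedding source, threshold, and toy data are assumptions, not the framework's actual components.

```python
import numpy as np

def sentence_vector(tokens, embeddings, dim=100):
    """Mean-pool word vectors into a sentence vector (unknown tokens are skipped)."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

def cosine(u, v):
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(u @ v / denom) if denom else 0.0

def candidate_network_words(sent_a_tokens, sent_b_tokens, embeddings, threshold=0.8):
    """If two sentences are near-paraphrases, tokens unique to one of them are
    treated as candidate network words substituting for a standard word."""
    va = sentence_vector(sent_a_tokens, embeddings)
    vb = sentence_vector(sent_b_tokens, embeddings)
    if cosine(va, vb) < threshold:
        return set()
    return set(sent_a_tokens) ^ set(sent_b_tokens)   # symmetric difference

# Toy usage with random embeddings (real word vectors would come from a trained model)
rng = np.random.default_rng(0)
vocab = ["the", "movie", "was", "great", "awesome", "film"]
emb = {w: rng.normal(size=100) for w in vocab}
print(candidate_network_words(["the", "movie", "was", "great"],
                              ["the", "film", "was", "awesome"], emb, threshold=0.0))
```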

Keywords: text information retrieval, natural language processing, new word discovery, information extraction

Procedia PDF Downloads 88
1497 Waste Management in a Hot Laboratory of Japan Atomic Energy Agency – 3: Volume Reduction and Stabilization of Solid Waste

Authors: Masaumi Nakahara, Sou Watanabe, Hiromichi Ogi, Atsuhiro Shibata, Kazunori Nomura

Abstract:

In the Japan Atomic Energy Agency, three types of experimental research, namely advanced reactor fuel reprocessing, radioactive waste disposal, and nuclear fuel cycle technology, have been carried out at the Chemical Processing Facility. The facility has generated high-level radioactive liquid and solid wastes in hot cells. The high-level radioactive solid waste is divided into three main categories: flammable waste, non-flammable waste, and solid reagent waste. Plastic products are categorized as flammable waste and are melted with a heating mantle. The non-flammable waste is cut with a band saw machine to reduce its volume. Among the solid reagent waste, the adsorbent used in the experiments is heated, and the extractant is decomposed for stabilization. All high-level radioactive solid wastes in the hot cells are packed into a high-level radioactive solid waste can. The high-level radioactive solid waste can is transported to the 2nd High Active Solid Waste Storage in the Tokai Reprocessing Plant of the Japan Atomic Energy Agency.

Keywords: high level radioactive solid waste, advanced reactor fuel reprocessing, radioactive waste disposal, nuclear fuel cycle technology

Procedia PDF Downloads 150
1496 Examining the Extent and Magnitude of Food Security amongst Rural Farming Households in Nigeria

Authors: Ajibade T., Omotesho O. A., Ayinde O. E, Ajibade E. T., Muhammad-Lawal A.

Abstract:

This study was carried out to examine the extent and magnitude of food security amongst rural farming households in Nigeria. The data used for this study were collected from a total of two hundred and forty rural farming households using a two-stage random sampling technique. The main tools of analysis for this study include descriptive statistics and a food security index constructed using the identification and aggregation procedure. The headcount ratio in this study reveals that 71% of individuals in the study area were food secure, with an average per capita calorie and protein availability of 4,213.92 kcal and 99.98 g, respectively. The aggregated household daily calorie availability and daily protein availability per capita were 3,634.57 kcal and 84.08 g, respectively, which happen to be above the food security lines of 2,470 kcal and 65 g used in this study. The food-insecure households fell short of the minimum daily per capita calorie and protein requirements by 2.1% and 24.9%, respectively. The study revealed that the area is food insecure due to the unequal distribution of the available food amongst the sampled population. The study recommends that the households should empower themselves financially in order to enhance their ability to afford food during both the on and off seasons. Also, processing and storage of farm produce should be enhanced in order to improve availability throughout the year.
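
As a hedged illustration of the identification and aggregation procedure, the short sketch below flags households against the calorie and protein lines used in the study and computes the headcount ratio and the mean shortfall. The household values are made up; only the 2,470 kcal and 65 g thresholds come from the abstract.

```python
# Illustrative per-capita daily availability for a few households;
# the real survey covered 240 households, and these numbers are made up.
calorie_per_capita = [3100, 2200, 4050, 1900, 2800, 2600, 2400, 3900]   # kcal per day
protein_per_capita = [70, 58, 95, 50, 82, 66, 61, 90]                   # grams per day

CALORIE_LINE = 2470.0    # food security lines used in the study
PROTEIN_LINE = 65.0

# Identification: a household is food secure if it meets both thresholds
secure = [c >= CALORIE_LINE and p >= PROTEIN_LINE
          for c, p in zip(calorie_per_capita, protein_per_capita)]

# Aggregation: headcount ratio and average calorie shortfall of the insecure households
headcount_ratio = sum(secure) / len(secure)
shortfalls = [1 - c / CALORIE_LINE for c, s in zip(calorie_per_capita, secure) if not s]
print("headcount ratio:", headcount_ratio)
print("mean calorie shortfall of insecure households:", sum(shortfalls) / len(shortfalls))
```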

Keywords: farming household, food security, identification and aggregation, food security index

Procedia PDF Downloads 282
1495 Emotional Artificial Intelligence and the Right to Privacy

Authors: Emine Akar

Abstract:

The majority of privacy-related regulation has traditionally focused on concepts that are perceived to be well understood or easily describable, such as certain categories of data and personal information or images. In the past century, such regulation appeared reasonably suitable for its purposes. However, technologies such as AI, combined with ever-increasing capabilities to collect, process, and store “big data”, not only require calibration of these traditional understandings but may require re-thinking of entire categories of privacy law. In the presentation, it will be explained, against the background of various emerging technologies under the umbrella term “emotional artificial intelligence”, why modern privacy law will need to embrace human emotions as potentially private subject matter. This argument can be made on a jurisprudential level, given that human emotions can plausibly be accommodated within the various concepts that are traditionally regarded as the underlying foundation of privacy protection, such as, for example, dignity, autonomy, and liberal values. However, the practical reasons for regarding human emotions as potentially private subject matter are perhaps more important (and very likely more convincing from the perspective of regulators). In that respect, it should be regarded as alarming that, according to most projections, the usefulness of emotional data to governments and, particularly, private companies will not only lead to radically increased processing and analysing of such data but, concerningly, to exponential growth in the collection of such data. In light of this, it is also necessary to discuss options for how regulators could address this emerging threat.

Keywords: AI, privacy law, data protection, big data

Procedia PDF Downloads 83
1494 B Spline Finite Element Method for Drifted Space Fractional Tempered Diffusion Equation

Authors: Ayan Chakraborty, BV. Rathish Kumar

Abstract:

Of late, many models in viscoelasticity, signal processing, or anomalous diffusion are formulated using fractional calculus. Tempered fractional calculus is a generalization of fractional calculus, and in the last few years several important partial differential equations occurring in different fields of science have been reconsidered in these terms, such as diffusion-wave equations, the Schrödinger equation, and so on. In the present paper, a time-dependent tempered fractional diffusion equation of order $\gamma \in (0,1)$ with a forcing function is considered. Existence, uniqueness, stability, and regularity of the solution have been proved. Crank-Nicolson discretization is used in the time direction, and a B-spline finite element approximation is implemented in space. Generally, B-spline bases are useful for representing the geometry of a finite element model and for interfacing with a finite element analysis program. By utilizing this technique, an a priori space-time estimate for the finite element analysis has been derived, and we prove that the order of convergence is $\mathcal{O}(h^{2}+T^{2})$, where $h$ is the spatial step size and $T$ is the time step. A couple of numerical examples are presented to confirm the accuracy of the theoretical results. Finally, we conclude that the studied method is useful for solving tempered fractional diffusion equations.
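
As a hedged illustration only, with $A_h$ standing in as a generic placeholder for the semi-discretized (tempered fractional) spatial operator rather than the authors' exact scheme, a Crank-Nicolson step and the stated error bound take the form:

```latex
\frac{u_h^{\,n+1}-u_h^{\,n}}{T}
  = \tfrac{1}{2}\,A_h\!\left(u_h^{\,n+1}+u_h^{\,n}\right) + f^{\,n+1/2},
\qquad
\max_{n}\,\bigl\|u(t_n)-u_h^{\,n}\bigr\| \;\le\; C\left(h^{2}+T^{2}\right),
```

where $h$ is the mesh size of the B-spline finite element space and $T$ is the time step.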

Keywords: B-spline finite element, error estimates, Gronwall's lemma, stability, tempered fractional

Procedia PDF Downloads 183
1493 The Evolution of Amazon Alexa: From Voice Assistant to Smart Home Hub

Authors: Abrar Abuzaid, Maha Alaaeddine, Haya Alesayi

Abstract:

This project is centered around understanding the usage and impact of Alexa, Amazon's popular virtual assistant, in everyday life. Alexa, known for its integration into devices like the Amazon Echo, offers functionalities such as voice interaction, media control, providing real-time information, and managing smart home devices. Our primary focus is to conduct a straightforward survey aimed at uncovering how people use Alexa in their daily routines. We plan to reach out to a wide range of individuals to get a diverse perspective on how Alexa is being utilized for various tasks, the frequency and context of its use, and the overall user experience. The survey will explore the most common uses of Alexa, its impact on daily life, the features that users find most beneficial, and the improvements they are looking for. This project is not just about collecting data but also about understanding the real-world applications of a technology like Alexa and how it fits into different lifestyles. By examining the responses, we aim to gain a practical understanding of Alexa's role in homes and possibly in workplaces. This project will provide insights into user satisfaction and areas where Alexa could be enhanced to meet the evolving needs of its users. It is a step towards connecting technology with everyday life, making it more accessible and user-friendly.

Keywords: Amazon Alexa, artificial intelligence, smart speaker, natural language processing

Procedia PDF Downloads 53
1492 Comparing UV-Based and O₃-Based AOPs for Removal of Emerging Contaminants from Food Processing Digestate Sludge

Authors: N. Moradi, C. M. Lopez-Vazquez, H. Garcia Hernandez, F. Rubio Rincon, D. Brdanovic, Mark van Loosdrecht

Abstract:

Advanced oxidation processes have been widely used for disinfection, for the removal of residual organic material, and for the removal of emerging contaminants from drinking water and wastewater. Yet, the application of these technologies to sludge treatment processes has not gained enough attention, mostly because of the complexity of the sludge matrix. In this research, ozone and UV/H₂O₂ treatments were applied for the removal of emerging contaminants from a digestate supernatant. The removal of the following compounds was assessed: (i) salicylic acid (SA), a surrogate for non-steroidal anti-inflammatory drugs (NSAIDs), and (ii) sulfamethoxazole (SMX), sulfamethazine (SMN), and tetracycline (TCN), the most frequent human and animal antibiotics. The ozone treatment was carried out in a plexiglass bubble column reactor with a capacity of 2.7 L; the system was equipped with a stirrer and a gas diffuser. The UV and UV/H₂O₂ treatments were done using an LED set-up (PearlLab beam device) with H₂O₂ dosing. In the ozone treatment evaluations, 95% of the three antibiotics was removed during the first 20 min of exposure time, while an SA removal of 91% occurred after 8 hours of exposure time. In the UV treatment evaluations, when adding the optimum dose of hydrogen peroxide (H₂O₂:COD molar ratio of 0.634), 36% of SA, 82% of TCN, and more than 90% of both SMX and SMN were removed after 8 hours of exposure time. This study concluded that O₃ was more effective than UV/H₂O₂ in removing emerging contaminants from the digestate supernatant.

Keywords: digestate sludge, emerging contaminants, ozone, UV-AOP

Procedia PDF Downloads 96
1491 Morphology Operation and Discrete Wavelet Transform for Blood Vessels Segmentation in Retina Fundus

Authors: Rita Magdalena, N. K. Caecar Pratiwi, Yunendah Nur Fuadah, Sofia Saidah, Bima Sakti

Abstract:

Vessel segmentation of the retinal fundus is important in the biomedical sciences for diagnosing ailments related to the eye. Segmentation can help medical experts diagnose the state of retinal fundus images. Therefore, in this study, we designed software in MATLAB that enables the segmentation of the retinal blood vessels in retinal fundus images. There are two main steps in the process of segmentation. The first step is image preprocessing, which aims to improve the quality of the image so that it can be optimally segmented. The second step is image segmentation, which performs the extraction process to retrieve the retina's blood vessels from the eye fundus image. The image segmentation methods analyzed in this study are the morphology operation, the discrete wavelet transform, and a combination of both. The dataset used in this project consists of 40 retinal images and 40 manually segmented images. After several testing scenarios, the average accuracy for the morphology operation method is 88.46%, while that for the discrete wavelet transform is 89.28%. By combining the two methods, the average accuracy was increased to 89.53%. The result of this study is an image processing system that can segment the blood vessels in the retinal fundus with high accuracy and low computation time.
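
The study's implementation was in MATLAB; as a hedged illustration of the two cues it combines, the Python sketch below uses an OpenCV black-hat morphology operation and a PyWavelets detail reconstruction, then thresholds their combination into a vessel mask. The kernel size, wavelet, weights, and file names are assumptions, not the paper's settings.

```python
import cv2
import numpy as np
import pywt

def segment_vessels(fundus_bgr):
    """Rough vessel map from a fundus image: morphology + wavelet detail enhancement."""
    green = fundus_bgr[:, :, 1]                       # green channel has the best vessel contrast
    green = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(green)

    # Morphology operation: black-hat emphasises dark vessels on the brighter background
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    blackhat = cv2.morphologyEx(green, cv2.MORPH_BLACKHAT, kernel)

    # Discrete wavelet transform: suppress the smooth approximation, keep the detail bands
    cA, (cH, cV, cD) = pywt.dwt2(green.astype(float), "haar")
    details = pywt.idwt2((np.zeros_like(cA), (cH, cV, cD)), "haar")
    details = cv2.normalize(np.abs(details), None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    details = cv2.resize(details, (green.shape[1], green.shape[0]))

    # Combine both cues and threshold into a binary vessel mask
    combined = cv2.addWeighted(blackhat, 0.5, details, 0.5, 0)
    _, mask = cv2.threshold(combined, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

mask = segment_vessels(cv2.imread("fundus.png"))      # file name is illustrative
cv2.imwrite("vessel_mask.png", mask)
```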

Keywords: discrete wavelet transform, fundus retina, morphology operation, segmentation, vessel

Procedia PDF Downloads 189
1490 Modeling Bessel Beams and Their Discrete Superpositions from the Generalized Lorenz-Mie Theory to Calculate Optical Forces over Spherical Dielectric Particles

Authors: Leonardo A. Ambrosio, Carlos. H. Silva Santos, Ivan E. L. Rodrigues, Ayumi K. de Campos, Leandro A. Machado

Abstract:

In this work, we propose an algorithm developed in the Python language for the modeling of ordinary scalar Bessel beams and their discrete superpositions and the subsequent calculation of the optical forces exerted on dielectric spherical particles. The mathematical formalism, based on the generalized Lorenz-Mie theory, is implemented in Python because of its large number of free mathematical (e.g., SciPy and NumPy), data visualization (Matplotlib and PyJamas) and multiprocessing libraries. We also propose an approach, provided by a synchronized Software as a Service (SaaS) in cloud computing, to develop a user interface embedded in a mobile application, thus providing users with the necessary means to easily introduce the desired unknowns and parameters and see the graphical outcomes of the simulations right on their mobile devices. Initially proposed as a free Android-based application, such an app enables data post-processing in cloud-based architectures and visualization of the results, figures and numerical tables.
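
As a hedged illustration of the scalar fields involved, the sketch below evaluates an ideal zeroth-order Bessel beam and a discrete superposition of equal-frequency Bessel beams (a frozen-wave profile) with SciPy. The wavelength, cone angle, and superposition coefficients are illustrative assumptions, and the optical force calculation from the generalized Lorenz-Mie theory is not shown.

```python
import numpy as np
from scipy.special import jv

def bessel_beam_field(rho, z, wavelength=632.8e-9, axicon_angle_deg=0.5, order=0):
    """Scalar field of an ideal order-n Bessel beam: psi = J_n(k_rho * rho) * exp(i k_z z).

    The wavelength and axicon (cone) angle are illustrative values, not parameters
    taken from the paper.
    """
    k = 2.0 * np.pi / wavelength
    theta = np.deg2rad(axicon_angle_deg)
    k_rho, k_z = k * np.sin(theta), k * np.cos(theta)
    return jv(order, k_rho * rho) * np.exp(1j * k_z * z)

def frozen_wave(rho, z, coefficients, wavelength=632.8e-9, L=1e-3):
    """Discrete superposition of equal-frequency Bessel beams (a frozen-wave profile)."""
    k = 2.0 * np.pi / wavelength
    q = 0.95 * k                                  # assumed central longitudinal wavenumber
    field = np.zeros_like(rho, dtype=complex)
    for m, a_m in coefficients.items():
        k_z = q + 2.0 * np.pi * m / L
        k_rho = np.sqrt(k**2 - k_z**2)
        field += a_m * jv(0, k_rho * rho) * np.exp(1j * k_z * z)
    return field

# On-axis intensity of a superposition of three Bessel beams
rho = np.zeros(200)
z = np.linspace(0.0, 1e-3, 200)
psi = frozen_wave(rho, z, coefficients={-1: 0.3, 0: 1.0, 1: 0.3})
print("single-beam on-axis amplitude:", abs(bessel_beam_field(0.0, 0.0)))
print("peak on-axis intensity of the superposition:", np.abs(psi).max() ** 2)
```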

Keywords: Bessel beams and frozen waves, generalized Lorenz-Mie theory, numerical methods, optical forces

Procedia PDF Downloads 375
1489 Effect of Thermal Pretreatment on Functional Properties of Chicken Protein Hydrolysate

Authors: Nutnicha Wongpadungkiat, Suwit Siriwatanayotin, Aluck Thipayarat, Punchira Vongsawasdi, Chotika Viriyarattanasak

Abstract:

Chicken products are a major export product of Thailand. With the dramatically increasing consumption of chicken products in the world, there is abundant waste from the chicken meat processing industry. Recently, much research on the development of value-added products from the chicken meat industry has focused on the production of protein hydrolysates, which are utilized as food ingredients for the human diet and for animal feed. The present study aimed to determine the effect of thermal pre-treatment on the functional properties of chicken protein hydrolysate. Chicken breasts were heated at 40, 60, 80 and 100 °C prior to hydrolysis by Alcalase at 60 °C and pH 8 for 4 hr. The hydrolysate was freeze-dried and subsequently used for assessment of its functional properties and of its molecular weight by gel electrophoresis (SDS-PAGE). The obtained results show that increasing the pre-treatment temperature increased the oil holding capacity and emulsion stability while decreasing the antioxidant activity and water holding capacity. The SDS-PAGE analysis showed evidence of protein aggregation in the hydrolysate treated at the higher pre-treatment temperatures. These results suggest a connection between the molecular weight of the hydrolysate and its functional properties.

Keywords: chicken protein hydrolysate, enzymatic hydrolysis, thermal pretreatment, functional properties

Procedia PDF Downloads 262
1488 General Time-Dependent Sequenced Route Queries in Road Networks

Authors: Mohammad Hossein Ahmadi, Vahid Haghighatdoost

Abstract:

Spatial databases have been an active area of research over the years. In this paper, we study how to answer General Time-Dependent Sequenced Route queries. Given the origin and destination of a user over a time-dependent road network graph, an ordered list of categories of interest, and a departure time interval, our goal is to find the minimum travel time path, along with the best departure time, that minimizes the total travel time from the source location to the given destination while passing through a sequence of points of interest belonging to each of the specified categories of interest. The challenge of this problem is the complexity added to optimal sequenced route queries, where we assume, first, that the road network is time-dependent and, second, that the user defines a departure time interval instead of one single departure time instance. For processing general time-dependent sequenced route queries, we propose two solutions, the Discrete-Time and Continuous-Time Sequenced Route approaches, finding approximate and exact solutions, respectively. Our proposed approaches traverse the road network based on the A*-search paradigm equipped with an efficient heuristic function for shrinking the search space. Extensive experiments are conducted to verify the efficiency of our proposed approaches.
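
As a hedged illustration of the search strategy, the sketch below runs an A*-style search over a toy time-dependent graph for one fixed departure time; repeating it over sampled departure times in the interval would correspond to a discrete-time treatment. The graph structure, heuristic, and travel-time functions are assumptions, not the paper's algorithm.

```python
import heapq

def time_dependent_a_star(graph, source, target, depart_time, heuristic):
    """A* search on a time-dependent graph.

    `graph[u]` is a list of (v, travel_time_fn) edges, where travel_time_fn(t)
    returns the edge travel time when entering the edge at time t; `heuristic(u)`
    is an admissible lower bound on the remaining travel time (e.g. straight-line
    distance divided by the maximum speed). This is an illustrative sketch, not
    the paper's exact query-processing algorithm.
    """
    best_arrival = {source: depart_time}
    frontier = [(heuristic(source), depart_time, source)]
    while frontier:
        _, t, u = heapq.heappop(frontier)
        if u == target:
            return t - depart_time                    # minimum travel time for this departure
        if t > best_arrival.get(u, float("inf")):
            continue                                  # stale queue entry
        for v, travel_time_fn in graph.get(u, []):
            arrival = t + travel_time_fn(t)
            if arrival < best_arrival.get(v, float("inf")):
                best_arrival[v] = arrival
                heapq.heappush(frontier, (arrival + heuristic(v), arrival, v))
    return float("inf")

# Toy network: one edge becomes slower during a 'rush hour' window starting at t = 30
congested = lambda t: 10 + (5 if t >= 30 else 0)
graph = {"A": [("B", congested), ("C", lambda t: 25)], "B": [("C", lambda t: 8)]}
print(time_dependent_a_star(graph, "A", "C", depart_time=0, heuristic=lambda u: 0))
```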

Keywords: trip planning, time dependent, sequenced route query, road networks

Procedia PDF Downloads 316