Search results for: algebraic signal processing

3186 Molecular Approach for the Detection of Lactic Acid Bacteria in the Kenyan Spontaneously Fermented Milk, Mursik

Authors: John Masani Nduko, Joseph Wafula Matofari

Abstract:

Many spontaneously fermented milk products are produced in Kenya, where they are integral to the human diet and play a central role in enhancing food security and income generation via small-scale enterprises. Fermentation enhances product properties such as taste, aroma, shelf-life, safety, texture, and nutritional value. Some of these products have demonstrated therapeutic and probiotic effects, although recent reports have linked some of them to death, biotoxin infections, and esophageal cancer. These products are mostly processed from poor-quality raw materials under unhygienic conditions, resulting in inconsistent product quality and limited shelf-lives. Although the products are very popular, research on their processing technologies is scarce, and none of them has been produced under controlled conditions using starter cultures. To modernize the processing technologies for these products, our study aims to describe the microbiology, biochemistry, and chemical composition of a representative Kenyan spontaneously fermented milk product, Mursik, using modern biotechnology (DNA sequencing). Moreover, co-creation processes reflecting stakeholders' experiences of traditional fermented milk production technologies and utilization, ideals, and senses of value, which will allow the generation of products based on common ground for rapid progress, will be discussed. The value of clean starting raw material will be emphasized, the need to define fermentation parameters highlighted, and the use of standard equipment to attain controlled fermentation discussed. This presentation will review the available information regarding traditional fermented milk (Mursik) and highlight our current research on the application of molecular approaches (metagenomics) to the valorization of the Mursik production process through starter culture/probiotic strain isolation and identification, as well as the quality and safety aspects of the product. The importance of the research and future research areas on the same subject will also be highlighted.

Keywords: lactic acid bacteria, high throughput biotechnology, spontaneous fermentation, Mursik

Procedia PDF Downloads 296
3185 Reduction of High-Frequency Planar Transformer Conduction Losses Using a Planar Litz Wire Structure

Authors: Hamed Belloumi, Amira Zouaoui, Ferid Kourda

Abstract:

A new trend in power converters is to design planar transformers that aim for a low profile. However, at high frequency, the planar transformer AC losses become significant due to the proximity and skin effects. In this paper, the design and implementation of a novel planar Litz conductor are presented in order to equalize the flux linkage and improve the current distribution. The developed PCB Litz wire structure minimizes the losses in a similar way to conventional multi-stranded Litz wires. In order to further illustrate the eddy current effect in different arrangements, a finite-element analysis (FEA) tool is used to analyze the current distribution inside the conductors. Finally, the proposed planar transformer has been integrated into an electronic stage for testing at high signal levels.

Keywords: planar transformer, finite-element analysis, winding losses, planar Litz wire

Procedia PDF Downloads 408
3184 Geological Structure Identification in the Semilir Formation: Correlated Geological and Geophysical (Very Low Frequency) Data for Disaster Zonation with Current Density Parameters and Geological Surface Information

Authors: E. M. Rifqi Wilda Pradana, Bagus Bayu Prabowo, Meida Riski Pujiyati, Efraim Maykhel Hagana Ginting, Virgiawan Arya Hangga Reksa

Abstract:

The VLF (Very Low Frequency) method is an electromagnetic method that uses low frequencies between 10 and 30 kHz, which results in fairly deep penetration. In this study, the VLF method was used for the zonation of disaster-prone areas by identifying geological structures in the form of faults. Data acquisition was carried out in the Trimulyo region, Jetis District, Bantul Regency, Special Region of Yogyakarta, Indonesia, along 8 measurement paths. The study uses wave transmitters located in Japan and Australia to obtain Tilt and Elipt values, from which RAE (Rapat Arus Ekuivalen, or equivalent current density) sections are built to identify areas that are easily crossed by electric current. Such a section indicates the existence of a geological structure in the form of a fault in the study area, which is characterized by a high RAE value. In the VLF data processing, a Tilt vs. Elipt graph and a Moving Average (MA) Tilt vs. MA Elipt graph were obtained for each path; they show a fluctuating pattern and no intersections. Data processing was carried out in Matlab. Areas with low RAE values of 0%-6% indicate a medium with low conductivity and high resistivity, interpreted as the sandstone, claystone, and tuff lithologies that are part of the Semilir Formation, whereas high RAE values of 10%-16% indicate a medium with high conductivity and low resistivity, interpreted as a fault zone filled with fluid. The existence of the fault zone is supported by the discovery of a normal fault on the surface with strike N55°W and dip 63°E at coordinates X = 433256 and Y = 9127722, so that residents' activities in the zone, such as housing, mining, and other activities, can be avoided to reduce the risk of natural disasters.
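
The abstract does not state which operators turn the smoothed Tilt data into the current-density (RAE) sections; a common choice in VLF interpretation is the Karous-Hjelt filter, whose coefficients are not reproduced here. The sketch below is only a minimal illustration of the first steps, under assumed parameters: a centered moving average for the MA Tilt profile, plus the simpler Fraser filter, which converts Tilt crossovers into peaks over conductive (possible fault) zones.

```python
import numpy as np

def moving_average(profile, window=5):
    """Centered moving average used to smooth a Tilt or Elipt profile (window length assumed)."""
    kernel = np.ones(window) / window
    return np.convolve(profile, kernel, mode="same")

def fraser_filter(tilt):
    """Fraser filter: F_i = (t_{i+2} + t_{i+3}) - (t_i + t_{i+1}); the sign/scale convention
    only decides whether conductors appear as maxima or minima."""
    t = np.asarray(tilt, dtype=float)
    return (t[2:-1] + t[3:]) - (t[:-3] + t[1:-2])

# usage: smooth a measured Tilt profile (%), then look for peaks marking conductive zones
tilt = np.array([2.1, 3.4, 5.0, 1.2, -4.8, -6.1, -2.0, 0.5])   # hypothetical station values
ma_tilt = moving_average(tilt)
peaks = fraser_filter(ma_tilt)
```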

Keywords: current density, faults, very low frequency, zonation

Procedia PDF Downloads 176
3183 Design of a Sliding Mode Control Using Nonlinear Sliding Surface and Nonlinear Observer Applied to the Trirotor Mini-Aircraft

Authors: Samir Zeghlache, Abderrahmen Bouguerra, Kamel Kara, Djamel Saigaa

Abstract:

The trirotor helicopter model involves nonlinearities, uncertainties, and external perturbations that should be considered in the design of control laws. This paper presents a control strategy for an underactuated six-degrees-of-freedom (6 DOF) trirotor helicopter, based on the coupling of fuzzy logic control and sliding mode control (SMC). The main purpose of this work is to eliminate the chattering phenomenon. To achieve this, fuzzy logic control is used to generate the hitting control signal, and a nonlinear observer is synthesized in order to estimate the unmeasured states. Finally, simulation results are included to show that the trirotor UAV with the proposed controller can greatly alleviate the chattering effect and remains robust to external disturbances.
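
To make the idea concrete, the sketch below is a minimal, generic fuzzy sliding mode law for a single second-order error channel, not the authors' 6 DOF controller: the discontinuous sign term that causes chattering is replaced by a small fuzzy inference on the sliding variable (the membership shapes, gains, and surface slope are assumptions).

```python
import numpy as np

def memberships(s):
    """Shoulder/triangular membership degrees of the sliding variable s in Negative, Zero, Positive."""
    neg = np.clip(-s, 0.0, 1.0)             # fully Negative for s <= -1
    zero = np.clip(1.0 - abs(s), 0.0, 1.0)  # peaks at s = 0
    pos = np.clip(s, 0.0, 1.0)              # fully Positive for s >= +1
    return neg, zero, pos

def fuzzy_hitting_control(s, k_max=5.0):
    """Rules (assumed): s Negative -> push positive; s Zero -> no push; s Positive -> push negative.
    Weighted-average defuzzification gives a smooth replacement for -k_max * sign(s)."""
    neg, zero, pos = memberships(s)
    return (neg * k_max + zero * 0.0 + pos * (-k_max)) / (neg + zero + pos + 1e-12)

def fuzzy_sliding_mode_control(e, e_dot, lam=2.0, u_eq=0.0):
    """Sliding surface s = e_dot + lam*e; total input = equivalent control + fuzzy hitting control."""
    s = e_dot + lam * e
    return u_eq + fuzzy_hitting_control(s)
```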

Keywords: fuzzy sliding mode control, trirotor helicopter, dynamic modelling, underactuated systems

Procedia PDF Downloads 537
3182 Analysis of Formation Methods of Range Profiles for an X-Band Coastal Surveillance Radar

Authors: Nguyen Van Loi, Le Thanh Son, Tran Trung Kien

Abstract:

The paper deals with the problem of the formation of range profiles (RPs) for an X-band coastal surveillance radar. Two popular methods, the difference operator method and the window-based method, are reviewed and analyzed via two tests with different datasets. The test results show that, although the original window-based method achieves better performance than the difference operator method, it has three main drawbacks: it uses only 3 or 4 peaks of an RP to create the windows; it extends the window size using the power sum of three adjacent cells on the left and right sides of the windows; and it applies the same threshold to all types of vessels to finish the formation process of RPs. These drawbacks lead to inaccurate RPs when the signal-to-clutter ratio is low. Therefore, some suggestions are proposed to improve the original window-based method.

Keywords: range profile, difference operator method, window-based method, automatic target recognition

Procedia PDF Downloads 128
3181 Field Saturation Flow Measurement Using Dynamic Passenger Car Unit under Mixed Traffic Condition

Authors: Ramesh Chandra Majhi

Abstract:

Saturation flow is a very important input variable for the design of signalized intersections. Saturation flow measurement is well established for homogeneous traffic. However, saturation flow measurement and modeling is a challenging task in heterogeneous traffic characterized by multiple vehicle types and non-lane-based movement. The present study focuses on proposing a field procedure for saturation flow measurement and on the effect of typical mixed-traffic behavior at the signal as far as non-lane-based movement is concerned. Data collected during peak and off-peak hours from five intersections with varying approach widths are used to validate the saturation flow model. The insights from the study can be used for modeling saturation flow and delay at signalized intersections under heterogeneous traffic conditions.
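
As a simple illustration of the underlying measurement idea (not the authors' field procedure), the sketch below converts classified counts discharged during the saturated portion of the green into passenger car units (PCU) and scales them to an hourly saturation flow; the vehicle classes and PCU factors are assumed placeholders.

```python
# Hypothetical saturation-flow computation from classified counts during saturated green.
pcu_factors = {"car": 1.0, "motorcycle": 0.4, "auto_rickshaw": 0.8, "bus": 3.0}  # assumed values
counts = {"car": 14, "motorcycle": 22, "auto_rickshaw": 6, "bus": 1}             # vehicles discharged
saturated_green_s = 30.0                                                         # seconds of saturated flow

pcu_total = sum(counts[v] * pcu_factors[v] for v in counts)
saturation_flow_pcu_per_h = pcu_total * 3600.0 / saturated_green_s
print(round(saturation_flow_pcu_per_h))   # PCU per hour of green for this approach
```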

Keywords: optimization, passenger car unit, saturation flow, signalized intersection

Procedia PDF Downloads 327
3180 Automatic Furrow Detection for Precision Agriculture

Authors: Manpreet Kaur, Cheol-Hong Min

Abstract:

The increasing deployment of robots equipped with machine vision sensors in precision agriculture offers a promising solution to various problems on agricultural farms. An important issue for such machine vision systems is crop row and weed detection. This paper proposes an automatic furrow detection system based on real-time processing for identifying crop rows in maize fields in the presence of weeds. The vision system is designed to be installed on farming vehicles and is therefore subject to gyroscopic motion, vibration, and other undesired movements. The images are captured under perspective and are affected by these undesired effects. The goal is to identify crop rows for vehicle navigation, which includes weed removal, where weeds are identified as plants outside the crop rows. Image quality is affected by different lighting conditions and by gaps along the crop rows due to lack of germination and incorrect planting. The proposed image processing method consists of four processes. First, the image is segmented using an HSV (Hue, Saturation, Value) decision tree: the algorithm uses the HSV color space to discriminate crops, weeds, and soil, and the region of interest is defined by filtering each of the HSV channels between maximum and minimum threshold values. Second, noise in the images is eliminated by means of a hybrid median filter. Third, mathematical morphological operations are applied, i.e., erosion to remove smaller objects followed by dilation to gradually enlarge the boundaries of foreground regions, which enhances the image contrast. Finally, to accurately detect the position of crop rows, the region of interest is defined by creating a binary mask, and edge detection and the Hough transform are applied to detect lines represented in polar coordinates, with furrow directions appearing as accumulations on the angle axis of the Hough space. The experimental results show that the method is effective.
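
A minimal OpenCV sketch of this kind of pipeline is shown below; the threshold bounds, kernel sizes, and Hough parameters are assumptions for illustration, not values from the paper, and a plain median filter stands in for the hybrid median filter.

```python
import cv2
import numpy as np

def detect_furrows(bgr_image):
    # 1. HSV segmentation: keep green vegetation (threshold bounds are assumed).
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (35, 60, 40), (85, 255, 255))

    # 2. Noise removal (plain median filter as a stand-in for the hybrid median filter).
    mask = cv2.medianBlur(mask, 5)

    # 3. Morphology: erosion removes small blobs, dilation restores row boundaries.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.dilate(cv2.erode(mask, kernel, iterations=1), kernel, iterations=2)

    # 4. Edge detection + Hough transform: crop rows show up as accumulations on the angle axis.
    edges = cv2.Canny(mask, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)
    return lines  # each line is (rho, theta) in polar coordinates

# usage (hypothetical file name):
# rows = detect_furrows(cv2.imread("maize_field.jpg"))
```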

Keywords: furrow detection, morphological, HSV, Hough transform

Procedia PDF Downloads 233
3179 Genetic Data of Deceased People: Solving the Gordian Knot

Authors: Inigo de Miguel Beriain

Abstract:

Genetic data of deceased persons are of great interest for both biomedical research and clinical use. This is due to several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to improve our response to these pathologies considerably if we could use these data. Unfortunately, at present, the status of data on the deceased is far from being satisfactorily resolved by the EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework on this issue. Consequently, each EU member state offers very different solutions. For instance, Denmark considers the data as personal data of the deceased person for a set period of time, while others, such as Spain, do not consider these data as such but have introduced some specifically focused regulations on this type of data and its access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least of which is scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis within data in general because they do not refer to one person but to several. Hence, it is possible to think that all of them can be considered data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain personal data of his or her biological relatives. Hence, the general regime provided for in the GDPR may apply to them. As these are personal data, we could go back to thinking in terms of a general prohibition of data processing, with the exceptions provided for in Article 9.2 and on the legal bases included in Article 6. This may be complicated in practice, given that, as we are dealing with data that refer to several data subjects, it may be complex to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. In this contribution, it is shown, however, that none of these objections is of sufficient substance to delegitimize the argument presented. Therefore, the conclusion of this contribution is that we can indeed build a general framework on the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although it is true that some clarifications will be necessary for its practical application.

Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people

Procedia PDF Downloads 155
3178 Impact Location from Instrumented Mouthguard Kinematic Data in Rugby

Authors: Jazim Sohail, Filipe Teixeira-Dias

Abstract:

Mild traumatic brain injury (mTBI) within non-helmeted contact sports is a growing concern due to the serious risk of potential injury. Extensive research is being conducted into head kinematics in non-helmeted contact sports, utilizing instrumented mouthguards that allow researchers to record accelerations and velocities of the head during and after an impact. This does not, however, allow the location of the impact on the head, and its magnitude and orientation, to be determined. This research proposes and validates two methods to quantify impact locations from instrumented mouthguard kinematic data, one using rigid body dynamics, the other utilizing machine learning. The rigid body dynamics technique focuses on establishing and matching moments from Euler's and torque equations in order to find the impact location on the head. The methodology is validated with impact data collected from a lab test with a dummy head fitted with an instrumented mouthguard. Additionally, a Hybrid III dummy head finite element model was utilized to create synthetic kinematic data sets for impacts from varying locations to validate the impact location algorithm. The algorithm calculates accurate impact locations; however, it requires preprocessing of live data, which is currently done by cross-referencing data timestamps with video footage. The machine learning technique focuses on eliminating this preprocessing step by establishing trends within time-series signals from instrumented mouthguards to determine the impact location on the head. An unsupervised learning technique is used to cluster together impacts within similar regions from an entire time-series signal. The kinematic signals from the mouthguards are converted to the frequency domain before a clustering algorithm groups similar signals within a time series that may span the length of a game. Impacts are clustered within predetermined location bins. The same Hybrid III dummy finite element model is used to create impacts that closely replicate on-field impacts in order to create synthetic time-series datasets consisting of impacts at varying locations. These time-series data sets are used to validate the machine learning technique. The rigid body dynamics technique provides a good method to establish the accurate impact location of signals that have already been labeled as true impacts and filtered out of the entire time series. The machine learning technique, however, provides a method that can be applied to long time-series signal data but gives the impact location only within predetermined regions on the head. Additionally, the machine learning technique can be used to eliminate false impacts captured by the sensors, saving additional time for data scientists using instrumented mouthguard kinematic data, as validating true impacts against video footage would not be required.
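
The clustering stage described above can be prototyped along the following lines; this is a simplified stand-in, with KMeans, the window format, and the number of location bins all being assumptions rather than the authors' choices.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_impacts_by_location(impact_windows, n_location_bins=6):
    """Cluster mouthguard impact windows into location bins.
    impact_windows: array of shape (n_impacts, n_samples) holding, e.g., linear-acceleration
    magnitude over a fixed window around each impact (a hypothetical input format)."""
    # Move each window to the frequency domain and keep the magnitude spectrum.
    spectra = np.abs(np.fft.rfft(impact_windows, axis=1))
    # Normalize so clustering keys on spectral shape rather than impact severity.
    spectra /= spectra.sum(axis=1, keepdims=True) + 1e-12
    model = KMeans(n_clusters=n_location_bins, n_init=10, random_state=0)
    labels = model.fit_predict(spectra)
    return labels  # one predetermined location bin per impact
```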

Keywords: head impacts, impact location, instrumented mouthguard, machine learning, mTBI

Procedia PDF Downloads 218
3177 Radar Signal Detection Using Neural Networks in Log-Normal Clutter for Multiple-Target Situations

Authors: Boudemagh Naime

Abstract:

Automatic radar detection requires methods of adapting to variations in the background clutter in order to control the false alarm rate. The problem becomes more complicated in a non-Gaussian environment. In fact, the conventional approach in real-time applications requires complex statistical modeling and many computational operations. To overcome these constraints, we propose another approach based on an artificial neural network (ANN-CMLD-CFAR) trained with a back-propagation (BP) algorithm. The considered environment follows a log-normal distribution in the presence of multiple Rayleigh targets. To evaluate the performance of the considered detector, several situations, such as the scale parameter and the number of interfering targets, have been investigated. The simulation results show that the ANN-CMLD-CFAR processor outperforms the conventional statistical one.
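
For context, the conventional censored mean level detector (CMLD) CFAR that serves as the statistical baseline can be sketched as follows; the guard size, reference window, censoring depth, and scale factor below are assumptions, not the paper's settings.

```python
import numpy as np

def cmld_cfar(power, guard=2, ref=16, censor=4, scale=3.5):
    """Censored Mean Level Detector CFAR on a vector of cell powers.
    For each cell under test, the reference cells (excluding guard cells) are sorted,
    the 'censor' largest are discarded to protect against interfering targets,
    and the threshold is the scaled mean of the remaining cells."""
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    half = ref // 2
    for i in range(half + guard, n - half - guard):
        lead = power[i - guard - half : i - guard]
        lag = power[i + guard + 1 : i + guard + 1 + half]
        reference = np.sort(np.concatenate([lead, lag]))[:-censor]  # censor the largest cells
        threshold = scale * reference.mean()
        detections[i] = power[i] > threshold
    return detections
```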

Keywords: radar detection, ANN-CMLD-CFAR, log-normal clutter, statistical modelling

Procedia PDF Downloads 367
3176 The Use of the Haar Mother Wavelet Signal Tool for Performance Analysis of a Distillation Column Response (Application to a Moroccan Case Study)

Authors: Mahacine Amrani

Abstract:

This paper reviews some Moroccan industrial applications of wavelets, especially in the dynamic identification of a process model using the Haar mother wavelet response. Two recent Moroccan case studies are described using dynamic data originating from a distillation column and an industrial polyethylene process plant. The purpose of the wavelet scheme is to build on-line dynamic models. In both case studies, a comparison is carried out between the Haar mother wavelet response model and a linear difference equation model. Finally, on the basis of the comparison of the process performances and the best responses, the paper concludes which model may be useful for creating an estimated on-line internal model control and for its application towards model predictive control (MPC). All calculations were implemented using AutoSignal software.
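
As a generic illustration of the Haar building block (not the AutoSignal routine used in the paper), one level of the Haar discrete wavelet transform splits a sampled response into local averages and local changes:

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.
    Returns approximation (low-pass) and detail (high-pass) coefficients."""
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:                      # Haar pairs samples, so drop a trailing odd sample
        x = x[:-1]
    even, odd = x[0::2], x[1::2]
    approximation = (even + odd) / np.sqrt(2.0)   # local averages
    detail = (even - odd) / np.sqrt(2.0)          # local changes
    return approximation, detail

# usage on a step-response-like record sampled from a process variable (made-up values)
y = np.array([0.0, 0.1, 0.4, 0.9, 1.3, 1.6, 1.8, 1.9])
cA, cD = haar_dwt(y)
```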

Keywords: process performance, model, wavelets, Haar, Moroccan

Procedia PDF Downloads 320
3175 GPU Based Real-Time Floating Object Detection System

Authors: Jie Yang, Jian-Min Meng

Abstract:

A GPU-based floating object detection scheme designed for floating mine detection tasks is presented in this paper. The system uses contrast and motion information to eliminate as many false positives as possible while avoiding false negatives. A GPU computation platform is deployed to allow objects to be detected in real time. The experimental results show that, with a certain configuration, the GPU-based scheme can speed up the computation by up to one thousand times compared to the CPU-based scheme.

Keywords: object detection, GPU, motion estimation, parallel processing

Procedia PDF Downloads 477
3174 A Literature Review of Precision Agriculture: Applications of Diagnostic Diseases in Corn, Potato, and Rice Based on Artificial Intelligence

Authors: Carolina Zambrana, Grover Zurita

Abstract:

The food loss that occurs with deficient agricultural production is one of the major problems worldwide. It puts the population's food security and the efficiency of farming investments at risk. Food security is expected to be achieved through each country's own efficient production, which will have an impact on the well-being of its population and, thus, also on food sovereignty. Production losses in quantity and quality occur due to the lack of efficient detection of diseases at an early stage. It is very difficult to improve agricultural efficiency using traditional methods, since they take a long time to carry out and detect the main diseases imprecisely, especially when the production areas are extensive. Therefore, the main objective of this research is to perform a systematic literature review, covering the last five years, of Precision Agriculture (PA), in order to understand the state of the art of the set of new technologies, procedures, and optimization processes based on Artificial Intelligence (AI). This study focuses on the diagnosis of diseases in corn, potato, and rice. The extensive literature review will be performed on the Elsevier, Scopus, and IEEE databases. In addition, this research focuses on advanced digital image processing and the development of software and hardware for PA. Convolutional neural networks receive special attention due to their outstanding diagnostic results. Moreover, the studied data will be combined with artificial intelligence algorithms for the automatic diagnosis of crop quality. Finally, precision agriculture, with technology applied to the agricultural sector, allows the land to be exploited efficiently. Such a system requires sensors, drones, data acquisition cards, and global positioning systems. This research seeks to merge different areas of science (control engineering, electronics, digital image processing, and artificial intelligence) for the development, in the near future, of a low-cost image measurement system that allows the optimization of crops with AI.

Keywords: precision agriculture, convolutional neural network, deep learning, artificial intelligence

Procedia PDF Downloads 81
3173 Experimental Demonstration of Broadband Erbium-Doped Fiber Amplifier

Authors: Belloui Bouzid

Abstract:

In this paper, a broadband design of an erbium-doped fiber amplifier (EDFA) is demonstrated and proved experimentally. High and broad gain is achieved over the C and L bands. The technique used combines, in one configuration, two double passes with a split-band structure for the amplification of two traveling signals, one in the C band and the other in the L band. This new topology is used to investigate the trends of high gain and wide amplification at different pumping powers, input wavelengths, and input signal powers. The paper explores the performance of the EDFA gain using what can be called a double-pass, double-branch wideband amplification configuration. The obtained results show a high gain of 44.24 dB and a wide amplification range of 80 nm.

Keywords: erbium doped fiber amplifier, erbium doped fiber laser, optical amplification, fiber laser

Procedia PDF Downloads 257
3172 Comparative Investigation of Two Non-Contact Prototype Designs Based on a Squeeze-Film Levitation Approach

Authors: A. Almurshedi, M. Atherton, C. Mares, T. Stolarski, M. Miyatake

Abstract:

Transportation and handling of delicate and lightweight objects is currently a significant issue in some industries. Two common contactless movement prototype designs, an ultrasonic transducer design and a vibrating plate design, are compared. Both designs are based on the method of squeeze-film levitation, and this study aims to identify the limitations and challenges of each. The designs are evaluated in terms of their levitation capabilities and characteristics. To this end, theoretical and experimental explorations are made. It is demonstrated that the ultrasonic transducer prototype design is better in terms of levitation capability, although it presents some operating and mechanical design difficulties. For making accurate industrial products in micro-fabrication and nanotechnology contexts, such as semiconductor silicon wafers, micro-components, and integrated circuits, non-contact, oil-free, ultra-precision, and low-wear transport along the production line is crucial. One of the designs (design A) is called the ultrasonic chuck, for which an ultrasonic transducer (Langevin, FBI 28452 HS) forms the main part. The other (design B) is a vibrating plate design, which consists of a plain rectangular plate made of aluminium firmly fastened at both ends. The size of the rectangular plate is 200 x 100 x 2 mm. In addition, four round piezoelectric actuators of 28 mm diameter and 0.5 mm thickness are glued to the underside of the plate. The vibrating plate is clamped at both ends in the horizontal plane through a steel supporting structure. The dynamics of levitation using designs A and B have been investigated based on squeeze-film levitation (SFL). The input apparatus used with the designs consists of a sine wave signal generator connected to an amplifier of type ENP-1-1U (Echo Electronics), which is used to magnify the sine wave voltage produced by the signal generator. The measured maximum levitation for three different semiconductor wafers of weights 52, 70, and 88 g for design A is 240, 205, and 187 um, respectively, whereas the results show that the average separation distance for a disk of 5 g weight for design B reaches 70 um. By using the methodology of squeeze-film levitation, it is possible to hold an object in a non-contact manner. The analysis of the investigation outcomes signifies that the non-contact levitation of design A provides more improvement than that of design B; however, design A is more complicated than design B in terms of manufacturing. In order to identify an adequate non-contact SFL design, a comparison between these two common designs has been adopted for the current investigation. Specifically, the study involves making comparisons in terms of the following issues: floating component geometries and material type constraints; the final created pressure distributions; dangerous interactions with the surrounding space; working environment constraints; and the complication and compactness of the mechanical design. Considering all these matters is essential for proficiently distinguishing the better SFL design.

Keywords: ANSYS, floating, piezoelectric, squeeze-film

Procedia PDF Downloads 151
3171 Electrophoretic Light Scattering Based on Total Internal Reflection as a Promising Diagnostic Method

Authors: Ekaterina A. Savchenko, Elena N. Velichko, Evgenii T. Aksenov

Abstract:

The development of pathological processes, such as cardiovascular and oncological diseases, is accompanied by changes in molecular parameters in cells, tissues, and serum. The study of the behavior of protein molecules in solution is of primary importance for the diagnosis of such diseases. Various physical and chemical methods are used to study molecular systems. With the advent of the laser and advances in electronics, optical methods, such as scanning electron microscopy, sedimentation analysis, nephelometry, and static and dynamic light scattering, have become the most universal, informative, and accurate tools for estimating the parameters of nanoscale objects. Electrophoretic light scattering is the most effective technique. It has a high potential in the study of biological solutions and their properties. This technique allows one to investigate the processes of aggregation and dissociation of different macromolecules and to obtain information on their shapes, sizes, and molecular weights. Electrophoretic light scattering is an analytical method that registers the motion of microscopic particles under the influence of an electric field by means of quasi-elastic light scattering in a homogeneous solution, with subsequent registration of the spectral or correlation characteristics of the light scattered from the moving object. We modified the technique by using the regime of total internal reflection with the aim of increasing its sensitivity and reducing the volume of the sample to be investigated, which opens the prospect of automating simultaneous multiparameter measurements. In addition, the method of total internal reflection allows one to study biological fluids at the level of single molecules, which also makes it possible to increase the sensitivity and the informativeness of the results, because the data obtained from an individual molecule are not averaged over an ensemble; this is important in the study of biomolecular fluids. To the best of our knowledge, the study of electrophoretic light scattering in the regime of total internal reflection is proposed here for the first time; latex microspheres 1 μm in size were used as test objects. In this study, the total internal reflection regime was realized on a quartz prism on which the free electrophoresis regime was set. A semiconductor laser with a wavelength of 655 nm was used as the radiation source, and the light scattering signal was registered by a PIN diode. The signal from the photodetector was then transmitted to a digital oscilloscope and to a computer. The autocorrelation functions and the fast Fourier transform, in the regime of Brownian motion and under the action of the field, were calculated to obtain the parameters of the object investigated. The main result of the study was the dependence of the autocorrelation function on the concentration of microspheres and the applied field magnitude. The effect of heating became more pronounced with increasing sample concentration and electric field. The results obtained in our study demonstrate the applicability of the method to the examination of liquid solutions, including biological fluids.
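
The two quantities computed from the digitized photodetector signal can be illustrated with a minimal generic sketch (the acquisition parameters of the experiment are not reproduced):

```python
import numpy as np

def autocorrelation(signal):
    """Normalized autocorrelation of the photodetector signal."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    return acf / acf[0]

def power_spectrum(signal, sample_rate_hz):
    """Magnitude spectrum of the scattered-light signal; under an applied field a Doppler
    component related to the electrophoretic drift appears as a peak."""
    x = np.asarray(signal, dtype=float)
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sample_rate_hz)
    return freqs, spectrum
```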

Keywords: light scattering, electrophoretic light scattering, electrophoresis, total internal reflection

Procedia PDF Downloads 217
3170 Depolymerised Natural Polysaccharides Enhance the Production of Medicinal and Aromatic Plants and Their Active Constituents

Authors: M. Masroor Akhtar Khan, Moin Uddin, Lalit Varshney

Abstract:

Recently, there has been rapidly expanding interest in finding applications of natural polymers with a view to adding value to agriculture. It is now being realized that radiation processing of natural polysaccharides can be beneficially utilized either to improve the existing methodologies used for processing natural polymers or to add value to agriculture by converting them into a more useful form. Gamma-ray irradiation is employed to degrade and lower the molecular weight of natural polysaccharides such as alginates, chitosan, and carrageenan into small-sized oligomers. When these oligomers are applied to plants as foliar sprays, they elicit various kinds of biological and physiological activities, including promotion of plant growth, seed germination, shoot elongation, root growth, and flower production, as well as suppression of heavy metal stress. Furthermore, application of these oligomers can shorten the harvesting period of various crops and help reduce the use of insecticides and chemical fertilizers. In recent years, oligomers of sodium alginate obtained by irradiating the latter with gamma rays at a dose of 520 kGy have been employed. It was noticed that the oligomers derived from the natural polysaccharides could enhance growth, photosynthetic efficiency, enzyme activities, and, most importantly, the production of secondary metabolites in plants such as Artemisia annua, Beta vulgaris, Catharanthus roseus, Chrysopogon zizanioides, Cymbopogon flexuosus, Eucalyptus citriodora, Foeniculum vulgare, Geranium sp., Mentha arvensis, Mentha citrata, Mentha piperita, Mentha virdis, Papaver somniferum and Trigonella foenum-graecum. As a result of the application of these oligomers, the yield and/or contents of the active constituents of the aforesaid plants were significantly enhanced. The productivity as well as the quality of medicinal and aromatic plants may be ameliorated by this novel technique in an economical way, as only a very small quantity of these irradiated (depolymerised) polysaccharides is needed. Further, this is a very safe technique, as the plants are not exposed directly to radiation; the radiation is used only to depolymerize the polysaccharides into oligomers.

Keywords: essential oil, medicinal and aromatic plants, plant production, radiation processed polysaccharides, active constituents

Procedia PDF Downloads 446
3169 Speed Control of Brushless DC Motor Using PI Controller in MATLAB Simulink

Authors: Do Chi Thanh, Dang Ngoc Huy

Abstract:

Nowadays, the growing use of variable-speed drive systems in small- and large-scale applications, such as the electric vehicle industry, household appliances, medical equipment, and other industrial fields, has led to the development of BLDC (brushless DC) motors. BLDC drives have many advantages, such as higher efficiency, better speed-torque characteristics, high power density, and low maintenance cost compared to other conventional motors. Most BLDC motors use a proportional-integral (PI) controller and a pulse width modulation (PWM) scheme for speed control. This article describes the simulation model of a BLDC motor drive control built with the MATLAB - SIMULINK simulation software. The simulation model includes a BLDC motor dynamics block, a Hall sensor signal generation block, an inverter block, and a PI controller.
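
Outside Simulink, the speed loop itself reduces to a discrete-time PI update; the sketch below is a generic illustration with assumed gains, limits, and sample time, not values from the article.

```python
class PISpeedController:
    """Discrete-time PI speed loop of the kind used in a BLDC drive model
    (gains, limits, and sample time are assumptions)."""

    def __init__(self, kp=0.05, ki=2.0, dt=1e-4, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0

    def update(self, speed_ref_rpm, speed_meas_rpm):
        error = speed_ref_rpm - speed_meas_rpm
        self.integral += self.ki * error * self.dt
        # Clamp the integrator as simple anti-windup, then form the PWM duty-cycle command.
        self.integral = min(max(self.integral, self.out_min), self.out_max)
        duty = self.kp * error + self.integral
        return min(max(duty, self.out_min), self.out_max)

# usage: one control step at a 10 kHz rate with a 2000 rpm reference
pi = PISpeedController()
duty_cycle = pi.update(speed_ref_rpm=2000.0, speed_meas_rpm=1850.0)
```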

Keywords: brushless DC motor, BLDC, six-step inverter, PI speed

Procedia PDF Downloads 76
3168 Analysis of Universal Mobile Telecommunications Service (UMTS) Planning Using High Altitude Platform Station (HAPS)

Authors: Yosika Dian Komala, Uke Kurniawan Usman, Yuyun Siti Rohmah

Abstract:

The enabling technology that meets the need for high-speed data services is the Universal Mobile Telecommunications Service (UMTS), which offers a data rate of up to 2 Mbps. A terrestrial UMTS system has a coverage area of about 1-2 km. With a High Altitude Platform Station (HAPS), a macro cell can be built that is able to serve a wider area. The design method for UMTS using HAPS is planned on the basis of coverage and capacity. The planning is simulated with Atoll 2.8.1 software. The cell radius is determined from the coverage using the free-space-loss propagation model, while the capacity planning determines the average cell throughput using the Offered Bit Quantity (OBQ).
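
For the coverage side, the free-space-loss model can be inverted to get the cell radius allowed by a link budget; the sketch below uses the standard formula FSPL(dB) = 32.45 + 20*log10(d_km) + 20*log10(f_MHz), with placeholder link-budget numbers that are not taken from the paper.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB."""
    return 32.45 + 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz)

def cell_radius_km(max_path_loss_db, freq_mhz):
    """Invert the free-space-loss model to get the coverage radius allowed by a link budget."""
    return 10 ** ((max_path_loss_db - 32.45 - 20 * math.log10(freq_mhz)) / 20)

# hypothetical numbers: UMTS downlink around 2100 MHz with a 130 dB allowed path loss
radius = cell_radius_km(max_path_loss_db=130.0, freq_mhz=2100.0)
```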

Keywords: UMTS, HAPS, coverage planning, capacity planning, signal level, Ec/Io, overlapping zone, throughput

Procedia PDF Downloads 641
3167 Study of the Design and Simulation Work for an Artificial Heart

Authors: Mohammed Eltayeb Salih Elamin

Abstract:

This study discusses the concept of the artificial heart using engineering concepts from fluid mechanics and the characteristics of non-Newtonian fluids. The aim is to serve heart patients and improve aspects of their lives, since statistics from the World Health Organization (WHO) indicate that diseases of the heart and blood vessels are the leading cause of death in the world. Statistics show that 30% of deaths worldwide are due to heart disease, so heart failure can simply be considered the number one cause of death in the entire world. Since heart transplantation has become very difficult and is not always available, the idea of the artificial heart has become essential. It is therefore important to participate in developing this idea by searching for and identifying the weak points in earlier designs, in the hope of improving them for the benefit of humanity. In this study, a pump was designed to pump blood through the human body, taking into account all the factors that would allow it to replace the human heart and to work with the same characteristics and efficiency as the human heart. The pump was designed on the principle of the diaphragm pump. Three blood models obtained from real blood characteristics were simulated in order to study the effect of the pumping work on the fluid. After that, the properties of the pump were studied using ANSYS 15 software to simulate blood flow inside the pump and the amount of stress it will undergo. The 3D geometry modeling was done using SOLID WORKS, and the geometries were then imported into the ANSYS Design Modeler, which is used during the pre-processing procedure. The solver used throughout the study is ANSYS FLUENT, a tool for analyzing fluid flow problems; the general well-known term for this branch of science is Computational Fluid Dynamics (CFD). Design Modeler is used during the pre-processing procedure, a crucial step before solving the fluid flow problem. Some of the key operations are geometry creation, which specifies the domain of the fluid flow problem; mesh generation, which means discretization of the domain so that the governing equations can be solved at each cell; and specification of the boundary zones in order to apply boundary conditions to the problem. Finally, the pre-processed work is saved in the ANSYS Workbench for future continuation of the work.

Keywords: artificial heart, computational fluid dynamics, heart chamber, design, pump

Procedia PDF Downloads 461
3166 Fuzzy Logic Control for Flexible Joint Manipulator: An Experimental Implementation

Authors: Sophia Fry, Mahir Irtiza, Alexa Hoffman, Yousef Sardahi

Abstract:

This study presents an intelligent control algorithm for a flexible robotic arm. Fuzzy control is used to control the motion of the arm to maintain the arm tip at the desired position while reducing vibration and increasing the system speed of response. The Fuzzy controller (FC) is based on adding the tip angular position to the arm deflection angle and using their sum as a feedback signal to the control algorithm. This reduces the complexity of the FC in terms of the input variables, number of membership functions, fuzzy rules, and control structure. Also, the design of the fuzzy controller is model-free and uses only our knowledge about the system. To show the efficacy of the FC, the control algorithm is implemented on the flexible joint manipulator (FJM) developed by Quanser. The results show that the proposed control method is effective in terms of response time, overshoot, and vibration amplitude.

Keywords: fuzzy logic control, model-free control, flexible joint manipulators, nonlinear control

Procedia PDF Downloads 124
3165 An Electrochemical DNA Biosensor Based on Oracet Blue as a Label for Detection of Helicobacter pylori

Authors: Saeedeh Hajihosseini, Zahra Aghili, Navid Nasirizadeh

Abstract:

An innovative DNA electrochemical biosensor based on Oracet Blue (OB) as an electroactive label and a gold electrode (AuE) for the detection of Helicobacter pylori is presented. A single-stranded DNA probe with a thiol modification was covalently immobilized on the surface of the AuE by forming an Au-S bond. Differential pulse voltammetry (DPV) was used to monitor DNA hybridization by measuring the electrochemical signal of the reduction of the OB bound to double-stranded DNA (ds-DNA). Our results showed that the OB-based DNA biosensor has a decent potential for the detection of a single-base mismatch in target DNA. The selectivity of the proposed DNA biosensor was further confirmed in the presence of non-complementary and complementary DNA strands. Under optimum conditions, the electrochemical signal had a linear relationship with the concentration of the target DNA ranging from 0.3 nmol L-1 to 240.0 nmol L-1, and the detection limit was 0.17 nmol L-1, with promising reproducibility and repeatability.
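
The calibration and detection-limit arithmetic implied by these figures can be illustrated as follows; the calibration points and blank noise below are made-up placeholders, and the 3-sigma/slope rule is one common convention rather than necessarily the one used by the authors.

```python
import numpy as np

# Hypothetical calibration points (target DNA concentration in nmol/L vs. DPV peak current in uA);
# the numbers are illustrative only, not data from the paper.
conc = np.array([0.3, 1.0, 10.0, 60.0, 120.0, 240.0])
current = np.array([0.05, 0.12, 1.01, 6.2, 12.1, 24.3])

slope, intercept = np.polyfit(conc, current, 1)   # linear calibration over the working range
blank_sd = 0.004                                  # assumed standard deviation of the blank signal (uA)
detection_limit = 3 * blank_sd / slope            # common 3*sigma/slope estimate of the LOD
```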

Keywords: DNA biosensor, Oracet Blue, Helicobacter pylori, gold electrode (AuE)

Procedia PDF Downloads 268
3164 Problems of Boolean Reasoning Based Biclustering Parallelization

Authors: Marcin Michalak

Abstract:

Biclustering is a way of analyzing two-dimensional data. For several years it has been possible to express this task in terms of Boolean reasoning, for processing continuous, discrete, and binary data. The mathematical background of the approach, namely the proven ability to induce exact and inclusion-maximal biclusters fulfilling the assumed criteria, is a strong advantage of the method. Unfortunately, the core of the method has quite high computational complexity. In the paper, the basics of the Boolean reasoning approach to biclustering are presented. In this context, the problems of parallelizing the computation are raised.

Keywords: Boolean reasoning, biclustering, parallelization, prime implicant

Procedia PDF Downloads 126
3163 Gaussian Mixture Model Based Identification of Arterial Wall Movement for Computation of Distension Waveform

Authors: Ravindra B. Patil, P. Krishnamoorthy, Shriram Sethuraman

Abstract:

This work proposes a novel Gaussian Mixture Model (GMM) based approach for accurate tracking of the arterial wall and subsequent computation of the distension waveform using the radio frequency (RF) ultrasound signal. The approach was evaluated on ultrasound RF data acquired with a prototype ultrasound system from an artery-mimicking flow phantom. The effectiveness of the proposed algorithm is demonstrated by comparison with existing wall tracking algorithms. The experimental results show that the proposed method provides a 20% reduction in the error margin compared to the existing approaches in tracking the arterial wall movement. This approach, coupled with an ultrasound system, can be used to estimate the arterial compliance parameters required for the screening of cardiovascular disorders.
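
A heavily simplified stand-in for the tracking step is sketched below: a Gaussian mixture is fitted to the depths of strong echoes in one RF line, and a component mean is taken as a wall-interface position. The component count, the percentile rule, and the use of scikit-learn are assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def track_wall_depths(rf_envelope, depth_axis_mm, n_components=3):
    """Fit a Gaussian mixture to the depth distribution of strong echoes in one RF line
    and return candidate interface depths (e.g. near/far wall)."""
    strong = depth_axis_mm[rf_envelope > np.percentile(rf_envelope, 95)]
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(strong.reshape(-1, 1))
    return np.sort(gmm.means_.ravel())

# distension waveform: track a wall depth over frames, then subtract the diastolic baseline
# distension = wall_depth_per_frame - wall_depth_per_frame.min()
```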

Keywords: distension waveform, Gaussian Mixture Model, RF ultrasound, arterial wall movement

Procedia PDF Downloads 508
3162 A System for Preventing Inadvertent Exposure of Staff Present outside the Operating Theater: Description and Clinical Test

Authors: Aya Al Masri, Kamel Guerchouche, Youssef Laynaoui, Safoin Aktaou, Malorie Martin, Fouad Maaloul

Abstract:

Introduction: Mobile C-arms move throughout the operating rooms of the operating theater. Being designed to move between rooms, they are not equipped with relays to retrieve the exposure information and export it outside the room. Therefore, no light signaling is available outside the room to warn staff of X-ray emission. Inadvertent exposure of staff outside the operating theater is a real problem for radiation protection. The French standard NF C 15-160 requires that (1) access to any room containing an X-ray emitting device must be controlled by light signage so that it cannot be crossed inadvertently, and (2) an emergency button must be set up to stop the X-ray emission. This study presents a system that we developed to meet these requirements and the results of its clinical test. Materials and methods: The system is composed of two communicating boxes. The "DetectBox" is to be installed inside the operating theater. It identifies the various operating states of the C-arm by analyzing its power supply signal and communicates wirelessly with the second box (AlertBox). The "AlertBox" can operate on mains or battery power and is to be installed outside the operating theater. It detects and reports the state of the C-arm by emitting a real-time light signal, which can have three different colors: red when the C-arm is emitting X-rays, orange when it is powered on but not emitting X-rays, and green when it is powered off. The two boxes communicate over a radio-frequency link carried exclusively in the Industrial, Scientific and Medical (ISM) frequency bands, which allows the coexistence of several on-site warning systems without communication conflicts (interference). Taking into account the complexity of performing electrical work in the operating theater (for reasons of hygiene and continuity of medical care), this system (having a size smaller than 10 cm²) works in complete safety without any intrusion into the mobile C-arm and does not require specific electrical installation work. The system is equipped with an emergency button that stops X-ray emission. The system has been clinically tested. Results: The clinical tests show that the system detects X-rays of both high and low energy (50-150 kVp) and high and low photon flux (0.5-200 mA), even when emitted for a very short time (<1 ms), with a probability of false detection below 10⁻⁵. It operates under all acquisition modes (continuous, pulsed, fluoroscopy, image, subtraction, and movie modes) and is compatible with all C-arm models and brands. We also tested the communication between the two boxes (DetectBox and AlertBox) under several conditions: (1) unleaded rooms, (2) leaded rooms, and (3) rooms with particular configurations (airlocks, great distances, concrete walls, 3 mm of lead). The result of these last tests was positive. Conclusion: This system is a reliable tool to alert staff present outside the operating room to X-ray emission and ensure their radiation protection.

Keywords: clinical test, inadvertent staff exposure, light signage, operating theater

Procedia PDF Downloads 129
3161 Heart Arrhythmia Prediction Using a Machine Learning Classification Approach and the Concept of Data Mining

Authors: Roshani S. Golhar, Neerajkumar S. Sathawane, Snehal Dongre

Abstract:

Background and objectives: Cardiovascular illnesses are increasing and becoming a leading cause of mortality worldwide, killing a large number of people each year. Arrhythmia is a type of cardiac illness characterized by a change in the regularity of the heartbeat. The goal of this study is to develop novel deep learning algorithms for successfully interpreting arrhythmia using a single one-second segment. Because the ECG signal reflects unique electrical heart activity across time, considerable changes between time intervals are detected. Such variance, as well as the limited amount of learning data available for each arrhythmia, makes standard learning methods difficult and impedes their generalization. Conclusions: The proposed method was able to outperform several state-of-the-art methods. The proposed technique is also an effective and convenient deep learning approach to heartbeat interpretation that could probably be used in real-time healthcare monitoring systems.
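
A minimal example of the kind of 1D convolutional classifier such work typically builds on is sketched below; the architecture, segment length, and class count are assumptions for illustration, not the network proposed here.

```python
import torch
import torch.nn as nn

class ECGBeatClassifier(nn.Module):
    """Small 1D CNN classifying one-second ECG segments into arrhythmia classes.
    Assumes 360 samples per segment (e.g. 360 Hz sampling) and 5 classes."""

    def __init__(self, n_samples=360, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(32 * (n_samples // 4), n_classes)

    def forward(self, x):            # x: (batch, 1, n_samples)
        z = self.features(x)
        return self.classifier(z.flatten(1))

# usage: one batch of 8 one-second segments of random data
model = ECGBeatClassifier()
logits = model(torch.randn(8, 1, 360))
```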

Keywords: electrocardiogram, ECG classification, neural networks, convolutional neural networks, portable document format

Procedia PDF Downloads 72
3160 Enzyme Redesign: From Metal-Dependent to Metal-Independent, a Symphony Orchestra without Concertmasters

Authors: Li Na Zhao, Arieh Warshel

Abstract:

The design of enzymes is an extremely challenging task, and this is also true for metalloenzymes. In the case of naturally evolved enzymes, one may consider the active-site residues as the musicians in the enzyme orchestra, while the metal can be considered their concertmaster. Together they catalyze reactions as if performing a masterpiece written by nature. Lactonase can be thought of as a member of the amidohydrolase family, with two concertmasters, Fe and Zn, at its active site. It catalyzes the degradation of the quorum sensing signals, the N-acyl homoserine lactones (AHLs or N-AHLs), by hydrolyzing the lactone ring. This process, known as quorum quenching, provides a strategy for the treatment of infectious diseases without introducing selection pressure. However, the activity of lactonase is metal-dependent, and this dependence hampers clinical usage. In our study, we use the empirical valence bond (EVB) approach to evaluate the catalytic contributions, decomposing them into electrostatic and other components.

Keywords: enzyme redesign, empirical valence bond, lactonase, quorum quenching

Procedia PDF Downloads 255
3159 Ischemic Stroke Detection in Computed Tomography Examinations

Authors: Allan F. F. Alves, Fernando A. Bacchim Neto, Guilherme Giacomini, Marcela de Oliveira, Ana L. M. Pavan, Maria E. D. Rosa, Diana R. Pina

Abstract:

Stroke is a worldwide concern; in Brazil alone, it accounts for 10% of all registered deaths. There are two stroke types, ischemic (87%) and hemorrhagic (13%). Early diagnosis is essential to avoid irreversible cerebral damage. Non-enhanced computed tomography (NECT) is one of the main diagnostic techniques used due to its wide availability and rapid diagnosis. Detection depends on the size and severity of the lesions and on the time between the first symptoms and the examination. The Alberta Stroke Program Early CT Score (ASPECTS) is a subjective method that increases the detection rate. The aim of this work was to implement an image segmentation system to enhance ischemic stroke and to quantify the area of ischemic and hemorrhagic stroke lesions in CT scans. We evaluated 10 patients with NECT examinations diagnosed with ischemic stroke. Analyses were performed on two axial slices, one at the level of the thalamus and basal ganglia and one adjacent to the top edge of the ganglionic structures, with a window width between 80 and 100 Hounsfield units. We used different image processing techniques, such as morphological filters, the discrete wavelet transform, and Fuzzy C-means clustering. Subjective analyses were performed by a neuroradiologist according to the ASPECTS scale to quantify ischemic areas in the middle cerebral artery region. These subjective results were compared with objective analyses performed by the computational algorithm. Preliminary results indicate that the morphological filters actually improve the ischemic areas for subjective evaluation. The comparison between the area of the ischemic region contoured by the neuroradiologist and the area defined by the computational algorithm showed no deviations greater than 12% in any of the 10 examinations, although there is a tendency for the areas contoured by the neuroradiologist to be smaller than those obtained by the algorithm. These results show the importance of computer-aided diagnosis software to assist neuroradiology decisions, especially in critical situations such as the choice of treatment for ischemic stroke.
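
One of the listed techniques, Fuzzy C-means clustering, can be sketched in a few lines for 1-D intensity values; this is a generic implementation under assumed parameters, not the paper's exact segmentation pipeline.

```python
import numpy as np

def fuzzy_c_means(values, n_clusters=3, m=2.0, n_iter=50, seed=0):
    """Plain Fuzzy C-means on 1-D intensity values (e.g. Hounsfield-windowed pixels)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(values, dtype=float).reshape(-1, 1)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                    # fuzzy memberships sum to 1 per pixel
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]   # membership-weighted cluster centers
        dist = np.abs(x - centers.T) + 1e-9              # (n_pixels, n_clusters) distances
        u = 1.0 / (dist ** (2.0 / (m - 1.0)))            # standard FCM membership update
        u /= u.sum(axis=1, keepdims=True)
    return centers.ravel(), u

# usage: cluster windowed pixel values into hypodense / normal / hyperdense tissue
# centers, memberships = fuzzy_c_means(windowed_pixels.ravel())
```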

Keywords: ischemic stroke, image processing, CT scans, Fuzzy C-means

Procedia PDF Downloads 370
3158 Processing Design of Miniature Casting Incorporating Stereolithography Technologies

Authors: Pei-Hsing Huang, Wei-Ju Huang

Abstract:

Investment casting is commonly used in the production of metallic components with complex shapes due to its high dimensional precision, good surface finish, and low cost. However, the process is cumbersome, and the period between trial casting and final production can be very long, thereby limiting business opportunities and competitiveness. In this study, we replaced conventional wax injection with stereolithography (SLA) 3D printing to speed up the trial process and reduce costs. We also used silicone molds to avoid the high cost imposed by photosensitive resin and to reduce costs further.

Keywords: investment casting, stereolithography, wax molding, 3D printing

Procedia PDF Downloads 408
3157 Raising the Topographic Plan of the Property Located near the Locality of Gircov, Romania

Authors: Carmen Georgeta Dumitrache

Abstract:

Terrestrial measurement science aims to study the totality of the operations and computations carried out in order to represent, on a plan or map, the land surface in a specific cartographic projection and at a topographic scale. With the development of society, land measurements have evolved, being dependent on the achievement of a utility goal bound to economic activity and of a scientific purpose related to determining the form and dimensions of the Earth. Field measurements, data processing, and the proper representation of the planimetry and landforms on drawings and maps rely on topographic and geodetic instruments, calculation, and graphical reporting, which require knowledge of theoretical and practical concepts from different areas of science and technology. To use properly in practice the topographic and geodetic instruments designed to measure angles and distances precisely, knowledge of geometric optics, precision mechanics, the strength of materials, and more is required. Processing the results of field measurements requires calculation methods based on notions of geometry, trigonometry, algebra, mathematical analysis, and computer science. To illustrate topographic measurements, a survey was carried out for the property located near the locality of Gircov, Romania. We determine the total surface of the plan (T30) and of the parcel/plot, and we also trace in the field the coordinates of a parcel. The purpose of the planimetric survey consisted of: the exact determination of the bounding surface; the analytical calculation of the surface; comparison of the determined surface with the one registered in the produced documents; drawing up a location and delineation plan, with contour closings and distances, as well as highlighting the parcels comprising this property; drawing up a location and delineation plan, with contour closings and distances, for a parcel; and tracing in the field the outline of the plot points from the previous step. The ultimate goal of this work was to determine and represent the surface, but also to detach a plot from the total surface while respecting the surface condition imposed by the deed of the beneficiary's property.
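
The analytical surface calculation mentioned above is usually done with the shoelace (cross-multiplication) formula on the parcel's corner coordinates; a minimal sketch, with made-up coordinates, follows.

```python
def polygon_area(points):
    """Analytical (shoelace) area of a closed parcel boundary from its planar coordinates.
    points: list of (X, Y) station coordinates in meters, in boundary order."""
    n = len(points)
    twice_area = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        twice_area += x1 * y2 - x2 * y1
    return abs(twice_area) / 2.0   # square meters

# hypothetical four-corner parcel (coordinates are illustrative only)
corners = [(433200.0, 9127700.0), (433260.0, 9127705.0), (433258.0, 9127760.0), (433198.0, 9127755.0)]
area_m2 = polygon_area(corners)
```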

Keywords: topography, surface, coordinate, modeling

Procedia PDF Downloads 261