Search results for: high resolution array processing techniques
8369 Medical Image Fusion Based On Redundant Wavelet Transform and Morphological Processing
Authors: P. S. Gomathi, B. Kalaavathi
Abstract:
The process in which complementary information from multiple images is integrated to provide a composite image that contains more information than the original input images is called image fusion. Medical image fusion combines multimodality medical images to give the physician additional information for a better diagnosis of diseases. This paper presents a wavelet-based medical image fusion algorithm for different multimodality medical images. To fuse the medical images, the images are decomposed using the Redundant Wavelet Transform (RWT). The high-frequency coefficients are convolved with a morphological operator, followed by the maximum-selection (MS) rule. The low-frequency coefficients are processed by the MS rule. The reconstructed image is obtained by the inverse RWT. Quantitative measures including mean, standard deviation, average gradient, spatial frequency, and edge-based similarity measures are used to evaluate the fused images. The performance of the proposed method is compared with the pixel-averaging, PCA, and DWT fusion methods. Compared with these conventional methods, the proposed framework provides better performance for the analysis of multimodality medical images.
Keywords: Discrete Wavelet Transform (DWT), Image Fusion, Morphological Processing, Redundant Wavelet Transform (RWT).
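As a rough, self-contained illustration of the decomposition-and-selection steps described in this abstract, the following Python sketch fuses two registered grayscale images with a single-level stationary (redundant) wavelet transform; the Haar wavelet, the single decomposition level, and the 3x3 dilation window are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of RWT fusion with a maximum-selection (MS) rule.
# Assumptions: Haar wavelet, one decomposition level, 3x3 grey dilation as the
# morphological activity measure; image dimensions must be even for swt2.
import numpy as np
import pywt
from scipy.ndimage import grey_dilation

def fuse_rwt(img_a, img_b, wavelet="haar"):
    """Fuse two registered grayscale images of identical (even) size."""
    cA1, (cH1, cV1, cD1) = pywt.swt2(img_a.astype(float), wavelet, level=1)[0]
    cA2, (cH2, cV2, cD2) = pywt.swt2(img_b.astype(float), wavelet, level=1)[0]

    def ms_rule(x, y, ax=None, ay=None):
        # Keep, pixel by pixel, the coefficient with the larger activity.
        ax = np.abs(x) if ax is None else ax
        ay = np.abs(y) if ay is None else ay
        return np.where(ax >= ay, x, y)

    # High-frequency sub-bands: morphological dilation as activity, then MS rule.
    fused_details = tuple(
        ms_rule(d1, d2,
                grey_dilation(np.abs(d1), size=(3, 3)),
                grey_dilation(np.abs(d2), size=(3, 3)))
        for d1, d2 in ((cH1, cH2), (cV1, cV2), (cD1, cD2))
    )
    # Low-frequency sub-band: plain MS rule, as stated in the abstract.
    fused_approx = ms_rule(cA1, cA2)

    # Inverse redundant wavelet transform gives the fused image.
    return pywt.iswt2([(fused_approx, fused_details)], wavelet)
```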
8368 The Influence of Project-Based Learning and Outcome-Based Education: Interior Design Tertiary Students in Focus
Authors: Omneya Messallam
Abstract:
Technology has developed dramatically in most educational disciplines. For instance, the digital rendering subject, taught in both the Interior Design and Architecture fields, sees its software versions updated almost every year. Many students and educators have argued that manual rendering techniques will no longer need to be learned. Therefore, the Interior Design Visual Presentation 1 course (ID133) was chosen from the first level of the Interior Design (ID) undergraduate program, as it has been taught continuously for six years. This time frame facilitates sound observation and critical analysis of the use of appropriate teaching methodologies. Furthermore, the researcher believes in the high value of manual rendering techniques. The course objectives are: to define the basic visual rendering principles, to recall theories and uses of various types of colours and hatches, to raise the learners' awareness of the value of studying manual rendering techniques, and to prepare them to present their work professionally. The students are female Arab learners aged between 17 and 20. At the outset of the course, the majority of them demonstrated a negative attitude, lacking both motivation and confidence in manual rendering skills. This paper is a reflective appraisal of deploying two student-centred teaching pedagogies, project-based learning (PBL) and outcome-based education (OBE), with ID133 students. This research aims to develop teaching strategies that enhance the quality of teaching in this course over an academic semester. The outcome of this research emphasized the positive influence of applying such educational methods on improving the quality of students' manual rendering skills in terms of materials, textiles, textures, lighting, and shade and shadow. Furthermore, it greatly motivated the students and raised their awareness of the importance of learning manual rendering techniques.
Keywords: Manual renders, outcome-based education, project-based learning, personal competences, and visual presentation.
8367 Levenberg-Marquardt Algorithm for Karachi Stock Exchange Share Rates Forecasting
Authors: Syed Muhammad Aqil Burney, Tahseen Ahmed Jilani, C. Ardil
Abstract:
Financial forecasting is an example of a signal processing problem. A number of ways to train the network are available. We have used the Levenberg-Marquardt algorithm with error back-propagation for weight adjustment. Pre-processing of the data has rescaled much of the large-scale variation to a smaller scale, reducing the variation in the training data.
Keywords: Gradient descent method, Jacobian matrix, Levenberg-Marquardt algorithm, quadratic error surfaces.
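As a minimal illustration of the Levenberg-Marquardt weight update mentioned above, the sketch below applies the damped Gauss-Newton step to a single linear-in-weights model on synthetic data; the network architecture, data, and damping value are illustrative assumptions, not the paper's setup.

```python
# One Levenberg-Marquardt update: dw = (J^T J + mu I)^(-1) J^T e, with e = t - y.
import numpy as np

def lm_step(w, X, t, mu=1e-2):
    y = X @ w                              # model output (linear in the weights)
    e = t - y                              # error vector
    J = -X                                 # Jacobian of the error w.r.t. weights
    H = J.T @ J + mu * np.eye(len(w))      # damped Gauss-Newton approximation
    dw = np.linalg.solve(H, -J.T @ e)      # solve rather than invert explicitly
    return w + dw

# Tiny usage example on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([0.5, -1.2, 2.0])
t = X @ w_true + 0.01 * rng.normal(size=50)
w = np.zeros(3)
for _ in range(10):
    w = lm_step(w, X, t)
print(w)  # should approach w_true
```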
8366 High Performance Direct Torque Control for Induction Motor Drive Fed from Photovoltaic System
Authors: E. E. El-Kholy, Ahamed Kalas, Mahmoud Fauzy, M. El-Shahat Dessouki, Abdou. M. El-Refay, Mohammed El-Zefery
Abstract:
Direct Torque Control (DTC) is an AC drive control method especially designed to provide fast and robust responses. In this paper, a progressive algorithm for direct torque control of a three-phase induction drive system supplied by photovoltaic arrays is presented; a voltage source inverter controls the motor torque and flux, with maximum power point tracking at different levels of insolation. Experimental results of the new DTC method, obtained on a rapid prototyping system for drives, are presented. Simulation and experimental results confirm that the proposed system gives quick, robust torque and speed responses at constant switching frequency.
Keywords: Photovoltaic (PV) array, direct torque control (DTC), constant switching frequency, induction motor, maximum power point tracking (MPPT).
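The abstract mentions maximum power point tracking but does not name the algorithm used; purely as an illustration of how an MPPT loop can drive the inverter's voltage reference, the sketch below shows the common perturb-and-observe (P&O) scheme.

```python
# Perturb-and-observe MPPT step (illustrative assumption, not the paper's method).
def perturb_and_observe(v_meas, i_meas, v_ref, step, state):
    """One P&O iteration. `state` carries the previous (power, voltage) sample."""
    p = v_meas * i_meas
    p_prev, v_prev = state
    if p > p_prev:
        # Power increased: keep perturbing the voltage in the same direction.
        v_ref += step if v_meas > v_prev else -step
    else:
        # Power decreased: reverse the perturbation direction.
        v_ref -= step if v_meas > v_prev else -step
    return v_ref, (p, v_meas)

# Usage: call once per control period with the measured PV voltage and current,
# starting from e.g. state = (0.0, 0.0) and a small voltage step.
```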
8365 Segmentation of Gray Scale Images of Dropwise Condensation on Textured Surfaces
Authors: Helene Martin, Solmaz Boroomandi Barati, Jean-Charles Pinoli, Stephane Valette, Yann Gavet
Abstract:
In the present work, we developed an image processing algorithm to measure water droplet characteristics during dropwise condensation on pillared surfaces. The main problem in this process is the similarity in shape and size between the water droplets and the pillars. The developed method divides droplets into four main groups based on their size and applies a corresponding algorithm to segment each group. These algorithms generate binary images of droplets based on both their geometrical and intensity properties. Information on droplet evolution over time, including mean radius and the number of drops per unit area, is then extracted from the binary images. The developed image processing algorithm is verified using manual detection and applied to two different sets of images corresponding to two kinds of pillared surfaces.
Keywords: Dropwise condensation, textured surface, image processing, watershed.
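As a rough sketch of the size-grouping and statistics-extraction steps described above, the following Python snippet labels droplets in a binary mask and reports per-size-group assignments, mean equivalent radius, and drop count per unit area; the global threshold and the size bins are placeholders, not the paper's four-group procedure.

```python
import numpy as np
from scipy import ndimage

def droplet_statistics(gray, size_bins=(50, 500, 5000)):
    """Label droplets in a grayscale frame and report simple statistics."""
    binary = gray > gray.mean()                 # rough global threshold (placeholder)
    labels, n = ndimage.label(binary)           # connected-component labelling
    if n == 0:
        return np.array([]), 0.0, 0.0
    areas = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    groups = np.digitize(areas, size_bins)      # assign each droplet to a size group
    radii = np.sqrt(areas / np.pi)              # equivalent circular radius (pixels)
    drops_per_area = n / binary.size            # droplet count per image pixel
    return groups, radii.mean(), drops_per_area
```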
8364 Image Processing on Geosynthetic Reinforced Layers to Evaluate Shear Strength and Variations of the Strain Profiles
Authors: S. K. Khosrowshahi, E. Güler
Abstract:
This study investigates the reinforcement function of geosynthetics on the shear strength and strain profile of sand. A series of simple shear tests was conducted to evaluate the shearing behavior of the samples under static and cyclic loads. Three different types of geosynthetics, including geotextiles and geonets, were used as reinforcement materials. An image processing analysis based on the optical flow method was performed to measure the lateral displacements and estimate the shear strains. It is shown that, besides improving the shear strength, the geosynthetic reinforcement leads to a remarkable reduction in shear strains. The reinforced layer reduces the soil thickness required to resist shear stresses. Consequently, geosynthetic reinforcement can be considered a suitable approach for sustainable design, especially in projects with extensive geotechnical works such as pavement subgrades, roadways, and railways.
Keywords: Image processing, soil reinforcement, geosynthetics, simple shear test, shear strain profile.
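As an illustration of the optical-flow-based strain estimation mentioned above, the sketch below computes a dense Farneback flow between two grayscale test images and derives a shear-strain profile as the depth gradient of the lateral displacement; the flow parameters and strain definition are illustrative assumptions, not the paper's exact processing chain.

```python
import cv2
import numpy as np

def shear_strain_profile(img_before, img_after, pixel_size_mm=1.0):
    """img_before / img_after: single-channel 8-bit images of the specimen."""
    flow = cv2.calcOpticalFlowFarneback(
        img_before, img_after, None,
        pyr_scale=0.5, levels=3, winsize=21,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    u = flow[..., 0] * pixel_size_mm           # horizontal displacement field (mm)
    # Shear strain ~ d(u)/d(y): change of lateral displacement with depth (rows).
    gamma = np.gradient(u, pixel_size_mm, axis=0)
    return gamma.mean(axis=1)                   # one value per depth (image row)
```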
8363 Parallel Vector Processing Using Multi Level Orbital DATA
Authors: Nagi Mekhiel
Abstract:
Many applications use vector operations by applying a single instruction to multiple data that map to different locations in conventional memory. Transferring data from memory is limited by access latency and bandwidth, which limits the performance gain of vector processing. We present a memory system that makes all of its content available to processors in time, so that processors need not access the memory: each location is forced to become available to all processors at a specific time. The data move in different orbits so as to become available to other processors in higher orbits at different times. We use this memory to apply parallel vector operations to data streams at the first orbit level. Data processed in the first level move to the upper orbit one data element at a time, allowing a processor in that orbit to apply another vector operation to deal with the serial-code limitations inherent in all parallel applications and to interleave it with lower-level vector operations.
Keywords: Memory organization, parallel processors, serial code, vector processing.
8362 A Novel Spectral Index for Automatic Shadow Detection in Urban Mapping Based On WorldView-2 Satellite Imagery
Authors: Kaveh Shahi, Helmi Z. M. Shafri, Ebrahim Taherzadeh
Abstract:
In remote sensing, shadows cause problems in many applications such as change detection and classification. They are cast by elevated objects and can directly affect the accuracy of extracted information. For these reasons, it is very important to detect shadows, particularly in high spatial resolution urban imagery, where they are a significant problem. This paper focuses on automatic shadow detection based on a new spectral index for multispectral imagery, known as the Shadow Detection Index (SDI). The new spectral index was tested on different areas of WorldView-2 images, and the results demonstrate that it can extract shadows effectively and automatically, with an accuracy of 94%. Furthermore, the new shadow detection index improved road extraction from 82% to 93%.
Keywords: Spectral index, shadow detection, remote sensing images, WorldView-2.
8361 A New Type of Integration Error and its Influence on Integration Testing Techniques
Authors: P. Prema, B. Ramadoss
Abstract:
Testing is required in both the development and maintenance phases of the software development life cycle, and integration testing (IT) is an important activity within it. Integration testing is based on the specification and functionality of the software and can thus be called a black-box testing technique. The purpose of integration testing is to test the integration between software components. In function or system testing, the concern is with overall behavior: whether the software meets its functional specifications or performance characteristics, or how well the software and hardware work together. This explains the importance and necessity of IT, where the emphasis is on the interactions between modules and their interfaces. Software errors should be discovered early during IT to reduce the costs of correction. This paper introduces a new type of integration error, presents an overview of integration testing techniques with a comparison of each technique, and identifies which technique detects which type of error.
Keywords: Integration Error, Integration Error Types, Integration Testing Techniques, Software Testing.
8360 Comparison of Particle Swarm Optimization and Genetic Algorithm for TCSC-based Controller Design
Authors: Sidhartha Panda, N. P. Padhy
Abstract:
Recently, genetic algorithm (GA) and particle swarm optimization (PSO) techniques have attracted considerable attention among modern heuristic optimization techniques. Since the two approaches are supposed to find a solution to a given objective function but employ different strategies and computational effort, it is appropriate to compare their performance. This paper presents the application and performance comparison of the PSO and GA optimization techniques for Thyristor Controlled Series Compensator (TCSC)-based controller design. The design objective is to enhance power system stability. The design problem of the FACTS-based controller is formulated as an optimization problem, and both the PSO and GA techniques are employed to search for the optimal controller parameters. The performance of both optimization techniques in terms of computational time and convergence rate is compared. Further, the optimized controllers are tested on a weakly connected power system subjected to different disturbances, and their performance is compared with the conventional power system stabilizer (CPSS). Eigenvalue analysis and non-linear simulation results are presented and compared to show the effectiveness of both techniques in designing a TCSC-based controller to enhance power system stability.
Keywords: Thyristor Controlled Series Compensator, genetic algorithm, particle swarm optimization, Phillips-Heffron model, power system stability.
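As a minimal illustration of the PSO search used for controller parameter tuning, the sketch below runs a standard global-best PSO loop; the objective function is a placeholder (the paper's eigenvalue/damping-based objective for the TCSC controller is not reproduced), and the swarm settings are illustrative assumptions.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))      # positions
    v = np.zeros_like(x)                                       # velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()]
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                              # keep within bounds
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()]
    return gbest, pbest_f.min()

# Usage with a placeholder objective (sum of squares) over 4 controller parameters.
best, best_f = pso(lambda p: float(np.sum(p**2)), bounds=[(-5, 5)] * 4)
```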
8359 Adjustment and Compensation Techniques for the Rotary Axes of Five-axis CNC Machine Tools
Authors: Tung-Hui Hsu, Wen-Yuh Jywe
Abstract:
Five-axis computer numerical control (CNC) machine tools (three linear and two rotary axes) are ideally suited to the fabrication of complex workpieces, such as dies, turbo blades, and cams. The locations of the axis average line and centerline of the rotary axes strongly influence the performance of these machines; however, techniques to compensate for eccentric error in the rotary axes remain weak. This paper proposes optical (Non-Bar) techniques capable of calibrating five-axis CNC machine tools and compensating for eccentric error in the rotary axes. The approach employs the measurement path in ISO/CD 10791-6 to determine the eccentric error in the two rotary axes, for which compensatory measures can be implemented. Experimental results demonstrate that the proposed techniques can improve the performance of various five-axis CNC machine tools by more than 90%. Finally, the result of a cutting test using a B-type five-axis CNC machine tool confirmed the usefulness of the proposed compensation technique.
Keywords: Calibration, compensation, rotary axis, five-axis computer numerical control (CNC) machine tools, eccentric error, optical calibration system, ISO/CD 10791-6
8358 An Intelligent Nondestructive Testing System of Ultrasonic Infrared Thermal Imaging Based on Embedded Linux
Authors: Hao Mi, Ming Yang, Tian-yue Yang
Abstract:
Ultrasonic infrared nondestructive testing is a testing method offering high speed, accuracy and localization. However, some problems remain: detection requires manual real-time field judgment, and the methods for storing and viewing results are still primitive. An intelligent nondestructive detection system based on embedded Linux is put forward in this paper. The hardware part of the detection system is based on an ARM (Advanced RISC Machine) core, and an embedded Linux system is built to realize image processing and defect detection of thermal images. The CLAHE algorithm and the Butterworth filter are used to process the thermal image, and then the Boa server and CGI (Common Gateway Interface) technology are used to transmit the test results to the display terminal through the network for real-time and remote monitoring. The system also reduces manual labor and removes the obstacle of manual judgment. According to the experimental results, the system provides a convenient and quick solution for industrial nondestructive testing.
Keywords: Remote monitoring, nondestructive testing, embedded Linux system, image processing.
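As a sketch of the two pre-processing steps named above, the snippet below applies contrast-limited adaptive histogram equalization (CLAHE) followed by a frequency-domain Butterworth low-pass filter to a thermal frame; the clip limit, tile size, cutoff frequency, and filter order are illustrative assumptions.

```python
import cv2
import numpy as np

def preprocess_thermal(img_u8, clip=2.0, tiles=(8, 8), cutoff=30, order=2):
    # 1) CLAHE on the 8-bit single-channel thermal image.
    clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=tiles)
    eq = clahe.apply(img_u8)

    # 2) Butterworth low-pass filter in the frequency domain.
    rows, cols = eq.shape
    u = np.arange(rows) - rows / 2
    v = np.arange(cols) - cols / 2
    D = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)       # distance from center
    H = 1.0 / (1.0 + (D / cutoff) ** (2 * order))         # Butterworth response
    F = np.fft.fftshift(np.fft.fft2(eq.astype(float)))
    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))
    return np.clip(filtered, 0, 255).astype(np.uint8)
```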
8357 Application of Interferometric Techniques for Quality Control of Oils Used in the Food Industry
Authors: Andres Piña, Amy Meléndez, Pablo Cano, Tomas Cahuich
Abstract:
The purpose of this project is to propose a quick and environmentally friendly alternative for measuring the quality of oils used in the food industry. There is evidence that repeated and indiscriminate use of oils in food processing causes physicochemical changes, with the formation of potentially toxic compounds that can affect the health of consumers and cause organoleptic changes. To assess the quality of oils, non-destructive optical techniques such as interferometry offer a rapid alternative to the use of reagents, using only the interaction of light with the oil. In this project, we used interferograms of oil samples placed under different heating conditions to establish the changes in their quality. These interferograms were obtained with a Mach-Zehnder interferometer using a 10 mW HeNe laser beam at 632.8 nm. Each interferogram was captured and analyzed, and its full width at half maximum (FWHM) was measured using the Amcap and ImageJ software. The FWHM values were organized into three groups. It was observed that the average of the FWHMs of group A shows almost linear behavior; therefore, it is probable that the exposure time is not relevant when the oil is kept at a constant temperature. Group B exhibits a slightly exponential trend when the temperature rises between 373 K and 393 K. Results of the Student's t-test indicate, with 95% confidence (0.05), the existence of variation in the molecular composition of both samples. Furthermore, we found a correlation between the iodine indexes (physicochemical analysis) and the interferograms (optical analysis) of group C. Based on these results, this project highlights the importance of the quality of the oils used in the food industry and shows how interferometry can be a useful tool for this purpose.
Keywords: Food industry, interferometric, oils, quality control.
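As a small illustration of the FWHM measurement used above, the sketch below computes the full width at half maximum of a one-dimensional fringe intensity profile with linear interpolation at the half-maximum crossings; the interpolation scheme is an illustrative choice, not the paper's software workflow.

```python
import numpy as np

def fwhm(profile):
    """Full width at half maximum of a 1-D fringe intensity profile, in samples."""
    y = np.asarray(profile, dtype=float)
    y = y - y.min()                     # remove the background offset
    half = y.max() / 2.0
    above = np.where(y >= half)[0]      # indices at or above half maximum
    left, right = above[0], above[-1]
    # Linear interpolation at each crossing for sub-sample precision.
    if left > 0:
        x_left = (left - 1) + (half - y[left - 1]) / (y[left] - y[left - 1])
    else:
        x_left = float(left)
    if right < len(y) - 1:
        x_right = right + (y[right] - half) / (y[right] - y[right + 1])
    else:
        x_right = float(right)
    return x_right - x_left

# Example: a Gaussian fringe with sigma = 5 has FWHM of about 11.77 samples.
x = np.arange(100)
print(fwhm(np.exp(-0.5 * ((x - 50) / 5.0) ** 2)))
```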
8356 Achieving Success in NPD Projects
Authors: Ankush Agrawal, Nadia Bhuiyan
Abstract:
The new product development (NPD) literature emphasizes the importance of introducing new products to the market for continuing business success. New products are responsible for employment, economic growth, technological progress, and high standards of living. Therefore, the study of NPD and the processes through which new products emerge is important. The goal of our research is to propose a framework of critical success factors, metrics, and tools and techniques for implementing metrics for each stage of the NPD process. An extensive literature review was undertaken to investigate decades of studies on NPD success and how it can be achieved. These studies were scanned for factors common to firms whose new products succeeded on the market. The paper summarizes NPD success factors, suggests metrics that should be used to measure these factors, and proposes tools and techniques to make use of these metrics. This was done for each stage of the NPD process and brought together in a framework that the authors propose should be followed for complex NPD projects. While many studies have been conducted on critical success factors for NPD, they tend to be fragmented and to focus on one or a few phases of the NPD process.
Keywords: New product development, performance, critical success factors, framework.
8355 A CFD Study of Heat Transfer Enhancement in Pipe Flow with Al2O3 Nanofluid
Authors: P. Kumar
Abstract:
Fluids are used for heat transfer in many engineering systems. Water, ethylene glycol and propylene glycol are some of the common heat transfer fluids. Over the years, in an attempt to reduce the size of the equipment and/or improve the efficiency of the process, various techniques have been employed to improve the heat transfer rate of these fluids. Surface modification, the use of inserts and increased fluid velocity are some examples of heat transfer enhancement techniques. The addition of milli- or micro-sized particles to the heat transfer fluid is another way of improving the heat transfer rate. Though this looks simple, the method has practical problems such as high pressure loss, clogging and erosion of the material of construction. These problems can be overcome by using nanofluids, which are dispersions of nanosized particles in a base fluid. Nanoparticles increase the thermal conductivity of the base fluid manyfold, which in turn increases the heat transfer rate. In this work, the heat transfer enhancement obtained using aluminium oxide nanofluid has been studied by computational fluid dynamics modeling of the nanofluid flow, adopting the single-phase approach.
Keywords: Heat transfer intensification, nanofluid, CFD, friction factor.
8354 Orchestra/Percussion Classification Algorithm for United Speech Audio Coding System
Authors: Yueming Wang, Rendong Ying, Sumxin Jiang, Peilin Liu
Abstract:
Unified Speech and Audio Coding (USAC), the latest MPEG standardization for unified speech and audio coding, uses a speech/audio classification algorithm to distinguish speech and audio segments of the input signal. The quality of the recovered audio can be increased by a well-designed orchestra/percussion classification and subsequent processing; owing to a shortcoming of the system, introducing such a classification and modifying the subsequent processing can greatly increase the quality of the recovered audio. This paper proposes an orchestra/percussion classification algorithm for the USAC system which extracts only 3 Mel-Frequency Cepstral Coefficients (MFCCs) rather than the traditional 13, and uses an Iterative Dichotomiser 3 (ID3) decision tree rather than more complex learning methods; the proposed algorithm therefore has lower computational complexity than most existing algorithms. Considering that frequent switching of attributes may lead to quality loss in the recovered audio signal, this paper also designs a modified subsequent process that helps the whole classification system reach an accuracy as high as 97%, comparable to the classical 99%.
Keywords: ID3 Decision Tree, MFCC, Orchestra/Percussion Classification, USAC
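As a rough sketch of the classification idea described above, the snippet below extracts a small number of MFCCs per audio segment and trains an entropy-based decision tree; scikit-learn's entropy criterion stands in for ID3, and the per-segment feature averaging, file names, and labels are illustrative assumptions.

```python
import numpy as np
import librosa
from sklearn.tree import DecisionTreeClassifier

def segment_features(path, n_mfcc=3, sr=16000):
    """Average a small set of MFCCs over the frames of one audio segment."""
    y, sr = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, frames)
    return mfcc.mean(axis=1)

# Hypothetical training data: lists of labelled audio files (not from the paper).
# X = np.array([segment_features(f) for f in files])
# y = np.array(labels)                # 0 = orchestra, 1 = percussion
# clf = DecisionTreeClassifier(criterion="entropy").fit(X, y)
# prediction = clf.predict([segment_features("unknown_segment.wav")])
```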
8353 Empirical Survey of the Solar System Based on the Fusion of GPS and Image Processing
Authors: S. Divya Gnanarathinam, S. Sundaramurthy
Abstract:
The tremendous increase in the world's population creates an immediate need for energy resources. People everywhere need sustainable, low-cost energy resources. Solar energy is appraised as one of the main energy resources in warm countries. Areas in the west of India, such as Rajasthan and Gujarat, are immensely rich in solar energy resources. This paper deals with the development of a dual-axis solar tracker using an Arduino board. Based on the astronomical estimates of the sun's position derived from GPS data and on sensor image processing outcomes, a methodology is proposed to locate the position of the sun so as to obtain the maximum solar energy. Depending on the outcomes, the solar tracking system decides whether to use the image processing outcomes or the astronomical estimates to attain the maximum efficiency of the solar panel. Finally, the experimental values obtained from the solar tracker for both sunny and rainy days are tabulated.
Keywords: Dual axis solar tracker, Arduino board, LDR sensors, global positioning system.
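As a sketch of the astronomical estimate mentioned above, the snippet below computes an approximate sun elevation and azimuth from GPS latitude, day of year, and local solar time using the standard declination and hour-angle formulas; this is an illustration under simplifying assumptions (Cooper's declination approximation, azimuth measured from south), not the paper's implementation.

```python
import math

def sun_position(lat_deg, day_of_year, solar_time_h):
    lat = math.radians(lat_deg)
    # Approximate solar declination (Cooper's formula).
    decl = math.radians(23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0)))
    hour_angle = math.radians(15.0 * (solar_time_h - 12.0))
    # Solar elevation angle.
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    elevation = math.degrees(math.asin(sin_el))
    # Approximate azimuth measured from south, positive towards west.
    cos_el = math.cos(math.asin(sin_el))
    sin_az = math.cos(decl) * math.sin(hour_angle) / cos_el
    azimuth = math.degrees(math.asin(max(-1.0, min(1.0, sin_az))))
    return elevation, azimuth

# Example: Gujarat (~22.3 deg N), day 172 (late June), 10:00 solar time.
print(sun_position(22.3, 172, 10.0))
```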
8352 Automatic Segmentation of Retina Vessels by Using Zhang Method
Authors: Ehsan Saghapour, Somayeh Zandian
Abstract:
Image segmentation is an important step in image processing. Major developments in medical imaging allow physicians to use potent and non-invasive methods to evaluate structure and performance and to diagnose human diseases. In this study, an active contour was used to extract vessel networks from color retina images. Automatic analysis of retinal vessels facilitates calculation of the arterial index, which is required to diagnose certain retinopathies.
Keywords: Active contour, retinal vessel segmentation, image processing.
8351 Mapping Paddy Rice Agriculture using Multi-temporal FORMOSAT-2 Images
Authors: Yi-Shiang Shiu, Meng-Lung Lin, Kang-Tsung Chang, Tzu-How Chu
Abstract:
Most paddy rice fields in East Asia are small parcels, and the weather conditions during the growing season are usually cloudy. FORMOSAT-2 multi-spectral images have an 8-meter resolution and one-day recurrence, ideal for mapping paddy rice fields in East Asia. To map rice fields, this study first determined the transplanting and most active tillering stages of paddy rice and then used multi-temporal images to distinguish the different growing characteristics of paddy rice and other ground covers. The unsupervised ISODATA (iterative self-organizing data analysis techniques) and supervised maximum likelihood classifiers were both used to discriminate paddy rice fields, with training areas automatically derived from ten-year cultivation parcels in Taiwan. Besides the original bands of the multi-spectral images, we also generated the normalized difference vegetation index (NDVI) and experimented with object-based pre-classification and post-classification. This paper discusses the results of the different image classification methods in an attempt to find a precise and automatic solution to mapping paddy rice in Taiwan.
Keywords: Paddy rice fields, multi-temporal, FORMOSAT-2 images, normalized difference vegetation index, object-based classification.
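As a small illustration of the vegetation index generated above, the snippet below computes the normalized difference vegetation index (NDVI) from the red and near-infrared bands of a multi-spectral scene; the band order is an assumption and must match the actual FORMOSAT-2 product.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """NDVI = (NIR - Red) / (NIR + Red), values roughly in [-1, 1]."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)
```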
8350 Effect of Injection Moulding Process Parameter on Tensile Strength Using Taguchi Method
Authors: Gurjeet Singh, M. K. Pradhan, Ajay Verma
Abstract:
The plastic industry plays a very important role in the economy of any country, generally accounting for a leading share of it. Since metals and their alloys are only rarely available on the earth, it is beneficial to produce plastic products and components, which find application in many industrial as well as household consumer products. About 50% of plastic products are manufactured by the injection moulding process. To produce a better quality product, its quality characteristics and performance must be controlled. The process parameters play a significant role in plastic production; hence, control of the process parameters is essential. In this paper, the effect of parameter selection on the injection moulding process is described, the aim being to define suitable parameters for producing the plastic product. Selecting the process parameters by trial and error is neither desirable nor acceptable, as it often tends to increase cost and time. Hence, optimization of the injection moulding process parameters is essential. The experiments were designed with Taguchi's orthogonal array to achieve the result with the least number of experiments. The plastic material polypropylene is studied. Tensile strength tests of the material, produced on an injection moulding machine, are carried out on a universal testing machine. By using the Taguchi technique with the help of MiniTab-14 software, the best values of injection pressure, melt temperature, packing pressure and packing time are obtained. We found that the process parameter packing pressure contributes the most to the production of a plastic product with good tensile strength.
Keywords: Injection moulding, tensile strength, Taguchi method, poly-propylene.
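As a minimal sketch of the Taguchi-style analysis implied above, the snippet below computes the larger-the-better signal-to-noise (S/N) ratio for each experimental run and averages it per factor level to gauge parameter influence; the run layout and the numbers are placeholders, not the paper's data.

```python
import numpy as np

def sn_larger_is_better(y):
    """Larger-the-better S/N ratio for replicate responses y (e.g. tensile strength)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

def mean_sn_per_level(levels, sn):
    """Average S/N ratio for each level of one factor across all runs."""
    levels = np.asarray(levels)
    return {lv: sn[levels == lv].mean() for lv in np.unique(levels)}

# Placeholder example: 9 runs, 2 replicates each, one factor with 3 levels.
responses = np.random.default_rng(1).uniform(25, 35, size=(9, 2))
sn = np.array([sn_larger_is_better(r) for r in responses])
packing_pressure_level = np.array([1, 2, 3, 1, 2, 3, 1, 2, 3])
print(mean_sn_per_level(packing_pressure_level, sn))
```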
8349 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies
Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk
Abstract:
Recently, the application of AI-powered algorithms in healthcare continues to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and PII (Personally Identifiable Information), is paramount in the delivery of efficient patient outcomes. However, as the exchange of healthcare information between patients and healthcare providers through AI-powered solutions increases, protecting a person's information and privacy has become even more important. Arguably, the increased adoption of healthcare AI has led to a significant focus on the security risks to healthcare data and the corresponding protection measures, with escalated analysis and enforcement. Since these challenges arise from the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, such projects propose AI-powered safeguards and policies/laws to protect the privacy of healthcare data. This project presents the best-in-class techniques used to preserve the data privacy of AI-powered healthcare applications. Popular privacy-protecting methods such as federated learning, cryptographic techniques, differential privacy methods, and hybrid methods are discussed together with potential cyber threats, data security concerns, and prospects. The project also discusses some of the relevant data security acts/laws that govern the collection, storage, and processing of healthcare data to guarantee that owners' privacy is preserved. This inquiry discusses various gaps and uncertainties associated with healthcare AI data collection procedures and identifies potential correction/mitigation measures.
Keywords: Data privacy, artificial intelligence, healthcare AI, data sharing, healthcare organizations.
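As a small illustration of one of the privacy-preserving methods named above, the snippet below implements the Laplace mechanism from differential privacy, which releases a noisy statistic so that any single patient's record has a bounded influence on the output; the epsilon value and the sensitivity bound are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Add Laplace noise with scale = sensitivity / epsilon to a statistic."""
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Example: privately release the number of patients with a given diagnosis.
# Adding or removing one patient changes the count by at most 1, so sensitivity = 1.
true_count = 42
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(round(private_count))
```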
8348 MIMO System Order Reduction Using Real-Coded Genetic Algorithm
Authors: Swadhin Ku. Mishra, Sidhartha Panda, Simanchala Padhy, C. Ardil
Abstract:
In this paper, a real-coded genetic algorithm (RCGA) optimization technique is applied to a large-scale linear dynamic multi-input multi-output (MIMO) system. The method is based on an error minimization technique in which the integral square error between the transient responses of the original and reduced-order models is minimized by the RCGA. The reduction procedure is simple and computer-oriented, and the approach is comparable in quality with other well-known reduction techniques. Also, the proposed method guarantees stability of the reduced model if the original high-order MIMO system is stable. The proposed approach to MIMO system order reduction is illustrated with the help of an example, and the results are compared with other recently published, well-known reduction techniques to show its superiority.
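As a rough sketch of the fitness function implied above, the snippet below evaluates the integral square error (ISE) between the step responses of an original and a candidate reduced-order model for a single input-output channel; the transfer functions are placeholders, and a genetic algorithm (not shown) would minimize this value over the reduced model's coefficients.

```python
import numpy as np
from scipy import signal

def ise_step(num_full, den_full, num_red, den_red, t_end=10.0, n=2000):
    """Integral square error between the step responses of two LTI models."""
    t = np.linspace(0.0, t_end, n)
    _, y_full = signal.step((num_full, den_full), T=t)
    _, y_red = signal.step((num_red, den_red), T=t)
    return np.trapz((y_full - y_red) ** 2, t)

# Placeholder example: a 3rd-order system and a candidate 1st-order model.
print(ise_step([8], [1, 6, 11, 6], [1.3], [1, 1]))
```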
8347 Comparison of MFCC and Cepstral Coefficients as a Feature Set for PCG Biometric Systems
Authors: Justin Leo Cheang Loong, Khazaimatol S Subari, Muhammad Kamil Abdullah, Nurul Nadia Ahmad, Rosli Besar
Abstract:
Heart sound is an acoustic signal, and many of the techniques used nowadays for human recognition tasks borrow from speech recognition. One popular choice for feature extraction from acoustic signals is the Mel Frequency Cepstral Coefficients (MFCC), which map the signal onto a non-linear Mel scale that mimics human hearing. However, the Mel scale is almost linear in the frequency region of heart sounds and should thus produce results similar to the standard cepstral coefficients (CC). In this paper, MFCC is investigated to see whether it produces superior results for a PCG-based human identification system compared to CC. Results show that the MFCC system is still superior to CC despite the near-linear filter banks in the lower frequency range, giving a correct recognition rate of up to 95% for MFCC and 90% for CC. Further experiments show that the high recognition rate is due to the implementation of the filter banks and not to the Mel scaling.
Keywords: Biometric, Phonocardiogram, Cepstral Coefficients, Mel Frequency.
8346 UWB Bowtie Slot Antenna for Breast Cancer Detection
Authors: N. Seladji-Hassaine, L. Merad, S.M. Meriah, F.T. Bendimerad
Abstract:
UWB is a very attractive technology for many applications. It provides many advantages such as fine resolution and high power efficiency. Our interest in the current study is the use of the UWB radar technique in microwave medical imaging systems, especially for early breast cancer detection. The Federal Communications Commission (FCC) has allocated the 3.1 to 10.6 GHz frequency band for this purpose. In this paper, we propose a UWB bowtie slot antenna with enhanced bandwidth. The effects of varying the antenna geometry on its performance and bandwidth are studied. The proposed antenna is simulated in CST Microwave Studio. Details of the antenna design and simulation results such as return loss and radiation patterns are discussed in this paper. The final antenna structure exhibits good UWB characteristics and surpasses the bandwidth requirements.
Keywords: Ultra Wide Band (UWB), microwave imaging system, bowtie antenna, return loss, impedance bandwidth enhancement.
8345 Interfacial Layer Effect on Novel p-Ni1-xO:Li/n-Si Heterojunction Solar Cells
Authors: Feng-Hao Hsu, Na-Fu Wang, Yu-Zen Tsai, Yu-Song Cheng, Cheng-Fu Yang, Mau-Phon Houng
Abstract:
This study fabricates p-type Ni1−xO:Li/n-Si heterojunction solar cells (P+/n HJSCs) by radio frequency (RF) magnetron sputtering and investigates the effect of substrate temperature on the photovoltaic cell properties. Grazing-incidence X-ray diffraction, four-point probe, and ultraviolet-visible-near-infrared measurements are used to determine the optoelectrical properties of the p-Ni1-xO thin films. The results show that the p-Ni1-xO thin film deposited at 300 °C has the largest grain size (22.4 nm), average visible transmittance (~42%), and electrical resistivity (2.7 Ωcm). However, the conversion efficiency of this cell is only 2.33%, which is lower than that of the cell fabricated at room temperature (3.39%). This result is mainly attributed to the interfacial layer (SiOx) thickness decreasing from 2.35 nm to 1.70 nm, as verified by high-resolution transmission electron microscopy.
Keywords: Heterojunction, nickel oxide, solar cells, sputtering.
8344 Reconstitute Information about Discontinued Water Quality Variables in the Nile Delta Monitoring Network Using Two Record Extension Techniques
Authors: Bahaa Khalil, Taha B. M. J. Ouarda, André St-Hilaire
Abstract:
The world economic crisis and budget constraints have caused authorities, especially those in developing countries, to rationalize water quality monitoring activities. Rationalization consists of reducing the number of monitoring sites, the number of samples, and/or the number of water quality variables measured. The reduction in water quality variables is usually based on correlation. If two variables exhibit high correlation, it is an indication that some of the information produced may be redundant. Consequently, one variable can be discontinued while the other continues to be measured. Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable by using the continuously measured one as an explanatory variable. In this paper, two record extension techniques are employed to reconstitute information about discontinued water quality variables: OLS and the line of organic correlation (LOC). An empirical experiment is conducted using water quality records from the Nile Delta water quality monitoring network in Egypt. The record extension techniques are compared for their ability to predict different statistical parameters of the discontinued variables. Results show that OLS is better at estimating individual water quality records. However, the results indicate an underestimation of the variance in the extended records. The LOC technique is superior in preserving the characteristics of the entire distribution and avoids underestimation of the variance. It is concluded from this study that OLS can be used for the substitution of missing values, while LOC is preferable for inferring statements about the probability distribution.
Keywords: Record extension, record augmentation, monitoring networks, water quality indicators.
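As a small illustration of the two record extension techniques compared above, the snippet below fits ordinary least squares (OLS) and the line of organic correlation (LOC) on an overlap period and shows how LOC preserves the standard deviation of the extended record while OLS shrinks it; the synthetic data are placeholders, not the Nile Delta records.

```python
import numpy as np

def fit_ols(x, y):
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

def fit_loc(x, y):
    # LOC slope: sign of the correlation, magnitude sd(y)/sd(x),
    # so the variance of the extended record is preserved.
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    intercept = np.mean(y) - slope * np.mean(x)
    return slope, intercept

rng = np.random.default_rng(3)
x = rng.normal(10, 2, 200)                  # continuously measured variable
y = 1.5 * x + rng.normal(0, 1.5, 200)       # discontinued variable (overlap period)
for name, (a, b) in {"OLS": fit_ols(x, y), "LOC": fit_loc(x, y)}.items():
    y_ext = a * x + b
    print(name, "extended-record std:", round(np.std(y_ext, ddof=1), 2),
          "observed std:", round(np.std(y, ddof=1), 2))
```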
8343 Vector Control of Multimotor Drive
Authors: Archana S. Nanoty, A. R. Chudasama
Abstract:
Three-phase induction machines are today a standard for industrial electrical drives. Cost, reliability, robustness and maintenance-free operation are among the reasons these machines are replacing DC drive systems. The development of power electronics and signal processing systems has eliminated one of the greatest disadvantages of such AC systems, which is the issue of control. With modern techniques of field-oriented vector control, variable speed control of induction machines is no longer a disadvantage. The need to increase system performance, particularly when facing limits on the power ratings of power supplies and semiconductors, motivates the use of phase numbers other than three. In this paper, a novel scheme of connecting two three-phase induction motors in parallel, fed by two inverters (a VSI and a CSI), and their vector control is presented.
Keywords: Field oriented control, multiphase induction motor, power electronics converter.
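As a minimal sketch of the coordinate transforms at the heart of field-oriented (vector) control, the snippet below applies the Clarke transform (a, b, c to alpha, beta) followed by the Park transform (alpha, beta to d, q) using the flux angle theta; the amplitude-invariant 2/3 scaling is one common convention, chosen here for illustration.

```python
import math

def clarke(ia, ib, ic):
    """Amplitude-invariant Clarke transform of three phase currents."""
    i_alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    i_beta = (2.0 / 3.0) * (math.sqrt(3) / 2.0) * (ib - ic)
    return i_alpha, i_beta

def park(i_alpha, i_beta, theta):
    """Rotate the stationary alpha-beta frame into the rotating d-q frame."""
    i_d = i_alpha * math.cos(theta) + i_beta * math.sin(theta)
    i_q = -i_alpha * math.sin(theta) + i_beta * math.cos(theta)
    return i_d, i_q

# Example: balanced currents at the instant theta = 0.
ia, ib, ic = 1.0, -0.5, -0.5
print(park(*clarke(ia, ib, ic), theta=0.0))   # the d-axis carries the amplitude
```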
8342 Banking Union: A New Step towards Completing the Economic and Monetary Union
Authors: Marijana Ivanov, Roman Šubić
Abstract:
This study analyzes the critical gaps in the architecture of European stability and the expected role of the banking union as an important new step towards completing the Economic and Monetary Union, one that should enable the creation of a safe and sound financial sector for the euro area market. The single rulebook, together with the Single Supervisory Mechanism and the Single Resolution Mechanism as the two main pillars of the banking union, should provide consistent application of common rules and administrative standards for the supervision, recovery and resolution of banks, with the final aim of replacing the former bail-out practice with a bail-in system through which possible future bank failures would be resolved with banks' own funds, i.e. with minimal costs for taxpayers and the real economy. In this way, the vicious circle between banks and sovereigns would be broken. It would also reduce the financial fragmentation recorded in the crisis years as a result of divergent behavior in risk premia, lending activity and interest rates between the core and the periphery. In addition, it should strengthen the effectiveness of monetary transmission channels, in particular the credit channel and the flow of liquidity on the money market, which, due to the fragmentation of the common financial market, were significantly impaired during the crisis. However, contrary to all the positive expectations related to the future functioning of the banking union, the major findings of this study indicate that the characteristics of the economic system in which the banking union will operate should not be ignored. The euro area is an integration of strong and weak entities with large differences in economic development, wealth, banking system assets, growth rates and accountability of fiscal policy. The analysis indicates that low and unbalanced economic growth remains a challenge for the maintenance of financial stability, and this problem cannot be resolved by single supervision alone. In many countries, bank assets exceed GDP several times over, and large banks remain a matter of concern because of their systemic importance for individual countries and the euro area as a whole. The creation of the Single Supervisory Mechanism and the Single Resolution Mechanism is a response to the European crisis, which particularly affected peripheral countries and caused the associated loop between the banking crisis and the sovereign debt crisis, but also influenced banks' balance sheets in the core countries as a result of cross-border capital flows. The creation of the SSM and the SRM should prevent similar episodes from happening again and should also provide a new opportunity for strengthening the economic and financial systems of the peripheral countries. On the other hand, there is a potential threat that the future focus of the ECB, the resolution mechanism and other relevant institutions will be oriented mainly towards large and significant banks (half of which operate in the core and most important euro area countries), and it therefore remains questionable to what extent the common resolution funds will be used to rescue less important institutions. Recent geopolitical developments will be the best indicator of whether the previously established mechanisms are sufficient to maintain adequate financial stability in the euro area market.
Keywords: Banking union, financial integration, Single Supervisory Mechanism (SSM).
8341 An Improved Cuckoo Search Algorithm for Voltage Stability Enhancement in Power Transmission Networks
Authors: Reza Sirjani, Nobosse Tafem Bolan
Abstract:
Many optimization techniques available in the literature have been developed in order to solve the problem of voltage stability enhancement in power systems. However, there are a number of drawbacks in the use of previous techniques aimed at determining the optimal location and size of reactive compensators in a network. In this paper, an Improved Cuckoo Search algorithm is applied as an appropriate optimization algorithm to determine the optimum location and size of a Static Var Compensator (SVC) in a transmission network. The main objectives are voltage stability improvement and total cost minimization. The results of the presented technique are then compared with other available optimization techniques.
Keywords: Cuckoo search algorithm, optimization, power system, var compensators, voltage stability.
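As a rough illustration of how cuckoo search proposes new candidate solutions (here, a hypothetical SVC location/size vector), the snippet below generates Levy-flight steps with the common Mantegna scheme; the beta value and step scaling are illustrative of the standard algorithm, not of the paper's improved variant.

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, beta=1.5, rng=None):
    """Draw one Levy-distributed step vector (Mantegna's algorithm)."""
    rng = np.random.default_rng() if rng is None else rng
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2)
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def propose(current, best, alpha=0.01, rng=None):
    # New nest: random walk around the current solution, biased towards the
    # best solution found so far, with Levy-distributed step lengths.
    step = levy_step(len(current), rng=rng)
    return current + alpha * step * (current - best)
```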
8340 Characterization of 3D-MRP for Analyzing of Brain Balancing Index (BBI) Pattern
Authors: N. Fuad, M. N. Taib, R. Jailani, M. E. Marwan
Abstract:
This paper discusses power spectral density (PSD) characteristics extracted from three-dimensional (3D) electroencephalogram (EEG) models. The EEG signals were recorded from 150 healthy subjects. Development of the 3D EEG models involves pre-processing of the raw EEG signals and construction of spectrogram images. Then, the maximum PSD values were extracted as features from the models. These features are analyzed using the mean relative power (MRP) and different mean relative power (DMRP) techniques to observe the pattern among different brain balancing indexes. The results showed that, by implementing these techniques, the pattern of brain balancing indexes can be clearly observed. Some patterns are indicated from index 1 to index 5 for the left frontal (LF) and right frontal (RF) regions.
Keywords: Power spectral density, 3D EEG model, brain balancing, mean relative power, different mean relative power.
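As a small illustration of the PSD and relative-power quantities used above, the snippet below estimates the power spectral density of an EEG channel with Welch's method and computes a mean relative power value per frequency band; the band edges and sampling rate are common defaults chosen for illustration, not the paper's settings.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def mean_relative_power(eeg, fs=256.0):
    f, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))    # PSD via Welch's method
    total = np.trapz(psd, f)
    mrp = {}
    for name, (lo, hi) in BANDS.items():
        mask = (f >= lo) & (f < hi)
        mrp[name] = np.trapz(psd[mask], f[mask]) / total   # band power / total power
    return mrp

# Example with synthetic data: 10 s of noise plus a 10 Hz (alpha) component.
t = np.arange(0, 10, 1 / 256.0)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)
print(mean_relative_power(x))
```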