Search results for: treatment algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11607


10797 Image Reconstruction Method Based on L0 Norm

Authors: Jianhong Xiang, Hao Xiang, Linyu Wang

Abstract:

Compressed sensing (CS) has a wide range of applications in sparse signal reconstruction. To address the low recovery accuracy and long reconstruction times of existing reconstruction algorithms in medical imaging, this paper proposes a corrected smoothed L0 algorithm based on compressed sensing (CSL0). First, an approximate hyperbolic tangent function (AHTF) that more closely resembles the L0 norm is proposed to approximate it. Secondly, to counter the "sawtooth phenomenon" of the steepest descent method and the sensitivity of the modified Newton method to the choice of initial value, the two methods are combined to improve reconstruction accuracy. Finally, the CSL0 algorithm is simulated on various images. The results show that the algorithm proposed in this paper improves the reconstruction accuracy of the test images by 0–0.98 dB.
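
For readers unfamiliar with smoothed-L0 reconstruction, the sketch below illustrates the general idea the abstract builds on: replace the L0 norm with a smooth surrogate, take a few steepest-descent steps on it, and project back onto the measurement constraint while gradually tightening the smoothing parameter. The Gaussian surrogate and all parameter values here are illustrative assumptions, not the paper's AHTF or its corrected Newton step.

```python
import numpy as np

def smoothed_l0(A, b, sigma_min=1e-3, sigma_decay=0.5, inner_iters=3, mu=2.0):
    """Generic smoothed-L0 sketch: gradient steps on a smooth surrogate of the
    L0 norm, each followed by projection back onto the feasible set {x: Ax=b}.
    The Gaussian surrogate below stands in for the paper's AHTF."""
    A_pinv = np.linalg.pinv(A)
    x = A_pinv @ b                              # minimum-L2-norm feasible start
    sigma = 2.0 * np.max(np.abs(x))
    while sigma > sigma_min:
        for _ in range(inner_iters):
            # gradient of sum(1 - exp(-x^2 / (2 sigma^2))), a smooth "count" of nonzeros
            grad = x * np.exp(-x**2 / (2 * sigma**2)) / sigma**2
            x = x - mu * sigma**2 * grad        # steepest-descent step
            x = x - A_pinv @ (A @ x - b)        # project onto Ax = b
        sigma *= sigma_decay                    # tighten the approximation
    return x

# toy example: recover a 5-sparse vector from 40 random measurements
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))
x_hat = smoothed_l0(A, A @ x_true)
print("recovery error:", round(float(np.linalg.norm(x_hat - x_true)), 4))
```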

Keywords: smoothed L0, compressed sensing, image processing, sparse reconstruction

Procedia PDF Downloads 113
10796 Development of a Model Based on Wavelets and Matrices for the Treatment of Weakly Singular Partial Integro-Differential Equations

Authors: Somveer Singh, Vineet Kumar Singh

Abstract:

We present a new model based on viscoelasticity for non-Newtonian fluids. We use a matrix-formulated algorithm to approximate solutions of a class of partial integro-differential equations with given initial and boundary conditions. Numerical results are presented to show how the operational matrix formulation simplifies the application of the method and reduces the computational cost. Convergence analysis, error estimation and numerical stability of the method are also investigated. Finally, some test examples are given to demonstrate the accuracy and efficiency of the proposed method.

Keywords: Legendre Wavelets, operational matrices, partial integro-differential equation, viscoelasticity

Procedia PDF Downloads 330
10795 Combination of Electrodialysis and Electrodeionization for Treatment of Condensate from Ammonium Nitrate Production

Authors: Lubomir Machuca, Vit Fara

Abstract:

Ammonium nitrate (AN) is produced by the reaction of ammonia and nitric acid, and a waste condensate is obtained. The condensate contains pure AN at concentrations of up to 10 g/L. The salt content of the condensate is too high for it to be discharged directly into the river, so it must be treated. This study is concerned with the treatment of condensates from an industrial AN production plant by a combination of electrodialysis (ED) and electrodeionization (EDI). The condensate concentration was in the range of 1.9–2.5 g/L of AN. A pilot ED module with 25 membrane pairs, followed by a laboratory EDI module with 10 membrane pairs, operated continuously for 800 hours. The results confirmed that the combination of ED and EDI is suitable for treating the condensate.

Keywords: desalination, electrodialysis, electrodeionization, fertilizer industry

Procedia PDF Downloads 236
10794 FE Analysis of Blade-Disc Dovetail Joints Using Mortar-Based Frictional Contact Formulation

Authors: Abbas Moradi, Mohsen Safajoy, Reza Yazdanparast

Abstract:

Analysis of blade-disc dovetail joints is one of the biggest challenges facing designers of aero-engines. To avoid comparatively expensive experimental full-scale tests, numerical methods can be used to simulate the loaded disc-blade assembly. The mortar method provides a powerful and flexible tool for solving frictional contact problems. In this study, 2D frictional contact in the dovetail joint has been analysed based on the mortar algorithm. In order to model the friction, the classical Coulomb friction law and the moving friction cone algorithm are applied. The solution is then obtained by solving the resulting set of non-linear equations with an efficient numerical algorithm based on the Newton–Raphson method. The numerical results show that this approach has a better convergence rate and accuracy than other proposed numerical methods.
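
The final step named in the abstract, solving the nonlinear system arising from the contact formulation with the Newton–Raphson method, follows the standard pattern sketched below. The two-equation residual is a made-up placeholder, not the paper's FE system.

```python
import numpy as np

def newton_raphson(residual, x0, tol=1e-10, max_iter=50, eps=1e-7):
    """Solve residual(x) = 0 with Newton-Raphson, using a finite-difference Jacobian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        # numerical Jacobian, column by column
        J = np.empty((len(r), len(x)))
        for j in range(len(x)):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (residual(x + dx) - r) / eps
        x = x - np.linalg.solve(J, r)           # Newton update
    return x

# toy residual standing in for the assembled nonlinear contact equations
f = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
print(newton_raphson(f, [1.0, 1.0]))            # converges to (1, 2)
```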

Keywords: computational contact mechanics, dovetail joints, nonlinear FEM, mortar approach

Procedia PDF Downloads 345
10793 Offset Dependent Uniform Delay Mathematical Optimization Model for Signalized Traffic Network Using Differential Evolution Algorithm

Authors: Tahseen Saad, Halim Ceylan, Jonathan Weaver, Osman Nuri Çelik, Onur Gungor Sahin

Abstract:

A new offset-dependent uniform-delay optimization model is derived as the main objective of this study and solved using a differential evolution algorithm. The aim is to control the coordination problem, which depends on offset selection, and to estimate the uniform delay resulting from the offset choice in a traffic signal network. Arrival and departure patterns are assumed to follow a periodic sinusoidal function. The cycle time is optimized at the entry links, and the optimized value is used in the non-entry links as a common cycle time. The offset optimization algorithm is used to calculate the uniform delay at each link. The results are illustrated with a case study and are compared with the canonical uniform delay model derived by Webster and with the Highway Capacity Manual's model. The findings show that the new model reduces the total uniform delay to almost half of that predicted by the conventional models. The mathematical objective function is robust, and the algorithm converges quickly.
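
As a rough illustration of the optimization machinery described above (not the authors' delay model), the sketch below minimizes a toy sinusoidal delay function of signal offsets with SciPy's differential evolution solver; the delay expression, link list, and cycle time are invented placeholders.

```python
import numpy as np
from scipy.optimize import differential_evolution

CYCLE = 90.0   # assumed common cycle time in seconds (placeholder)

def total_uniform_delay(offsets):
    """Toy stand-in for an offset-dependent uniform-delay objective: each link's
    delay varies sinusoidally with the offset difference of its two signals."""
    links = [(0, 1), (1, 2), (2, 3), (0, 3)]            # hypothetical network links
    delay = 0.0
    for a, b in links:
        phase = 2 * np.pi * (offsets[b] - offsets[a]) / CYCLE
        delay += 20.0 + 15.0 * np.cos(phase)            # seconds of delay per link
    return delay

bounds = [(0.0, CYCLE)] * 4                             # one offset per signal
result = differential_evolution(total_uniform_delay, bounds, seed=1, tol=1e-8)
print("best offsets:", np.round(result.x, 1), "delay:", round(result.fun, 2))
```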

Keywords: area traffic control, traffic flow, differential evolution, sinusoidal periodic function, uniform delay, offset variable

Procedia PDF Downloads 272
10792 CO₂ Capture by Clay and Its Adsorption Mechanism

Authors: Jedli Hedi, Hedfi Hachem, Abdessalem Jbara, Slimi Khalifa

Abstract:

Natural and modified clays were used as adsorbents for CO2 capture. The clay samples were subjected to acid treatment to improve their textural properties, namely surface area and pore volume. The modifications were carried out by heating the clays at 120 °C and then treating them with a 3 M sulphuric acid solution at boiling temperature for 10 h. The CO2 adsorption measurements on the acid-treated clay were carried out in a batch reactor. It was found that the clay sample treated with 3 M H2SO4 exhibited the highest Brunauer–Emmett–Teller (BET) surface area (16.29–24.68 m²/g) and pore volume (0.056–0.064 cm³/g). After the acid treatment, the CO2 adsorption capacity of the clay increased. The samples were characterized by SEM, FTIR, ATD-ATG and the BET method. To describe the CO2 adsorption on these materials, the adsorption isotherms were modeled using the Freundlich and Langmuir models. The CO2 adsorption was found to be attributable to physical adsorption.
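
To make the last step concrete, the snippet below shows the usual way Langmuir and Freundlich isotherms are fitted to equilibrium adsorption data with a least-squares routine; the pressure and uptake values are fabricated for illustration only, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# illustrative equilibrium data: CO2 pressure (bar) vs. adsorbed amount (mmol/g)
P = np.array([0.1, 0.3, 0.6, 1.0, 1.5, 2.0])
q = np.array([0.12, 0.28, 0.45, 0.60, 0.72, 0.80])

def langmuir(P, q_max, b):
    return q_max * b * P / (1.0 + b * P)

def freundlich(P, K, n):
    return K * P ** (1.0 / n)

(qm, b), _ = curve_fit(langmuir, P, q, p0=[1.0, 1.0])
(K, n), _ = curve_fit(freundlich, P, q, p0=[0.5, 2.0])
print(f"Langmuir:   q_max={qm:.3f} mmol/g, b={b:.3f} 1/bar")
print(f"Freundlich: K={K:.3f}, n={n:.3f}")
```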

Keywords: clay, acid treatment, CO2 capture, adsorption mechanism

Procedia PDF Downloads 207
10791 Evaluation of the MCFLIRT Correction Algorithm in Head Motion from Resting State fMRI Data

Authors: V. Sacca, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone

Abstract:

In the last few years, resting-state functional MRI (rs-fMRI) has been widely used to investigate the architecture of brain networks by measuring the Blood Oxygenation Level Dependent (BOLD) response. This technique represents an interesting, robust and reliable approach for comparing pathological and healthy subjects in order to investigate the evolution of neurodegenerative diseases. On the other hand, the processing of rs-fMRI data is very prone to noise from confounding factors, especially head motion. Head motion has long been known to be a source of artefacts in task-based functional MRI studies, but it has become a particularly challenging problem in recent studies using rs-fMRI. The aim of this work was to evaluate, in multiple sclerosis (MS) patients, a well-known motion correction algorithm from the FMRIB Software Library, MCFLIRT, which can be applied to minimize head motion distortions and allow rs-fMRI results to be interpreted correctly.
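
In practice, MCFLIRT is usually invoked from the command line or through a wrapper such as Nipype; the sketch below shows a typical Nipype call, assuming FSL and Nipype are installed and that rest.nii.gz is the 4D rs-fMRI series (the file names and option values are placeholders, not the settings used in this study).

```python
# Minimal Nipype wrapper around FSL's MCFLIRT (assumes FSL and nipype are installed).
from nipype.interfaces import fsl

mcflirt = fsl.MCFLIRT()
mcflirt.inputs.in_file = "rest.nii.gz"      # hypothetical 4D resting-state series
mcflirt.inputs.cost = "mutualinfo"          # cost function for volume registration
mcflirt.inputs.save_plots = True            # write per-volume motion parameters
mcflirt.inputs.out_file = "rest_mcf.nii.gz"
result = mcflirt.run()
print(result.outputs.par_file)              # translation/rotation traces for QC
```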

Keywords: head motion correction, MCFLIRT algorithm, multiple sclerosis, resting state fMRI

Procedia PDF Downloads 210
10790 Directly Observed Treatment Short-Course (DOTS) for TB Control Program: A Ten Years Experience

Authors: Solomon Sisay, Belete Mengistu, Woldargay Erku, Desalegne Woldeyohannes

Abstract:

Background: Tuberculosis is still the leading cause of illness in the world, accounting for 2.5% of the global burden of disease and 25% of all avoidable deaths in developing countries. Objectives: The aim of the study was to assess the impact of the DOTS strategy on tuberculosis case finding and treatment outcomes in Gambella Regional State, Ethiopia, from 2003 to 2012 and from 2002 to 2011, respectively. Methods: A health facility-based retrospective study was conducted. Data were collected and reported on a quarterly basis, using the WHO reporting format for TB case finding and treatment outcome, from all DOTS-implementing health facilities in all zones of the region to the Federal Ministry of Health. Results: A total of 10,024 TB cases of all forms were registered between 2003 and 2012. Of these, 4,100 (40.9%) were smear-positive pulmonary TB, 3,164 (31.6%) were smear-negative pulmonary TB and 2,760 (27.5%) were extra-pulmonary TB. The case detection rate of smear-positive pulmonary TB increased from 31.7% to 46.5% of total TB cases, and the treatment success rate increased from 13% to 92%, with mean values of 40.9% (SD = 0.1) and 55.7% (SD = 0.28), respectively, over the specified periods. Moreover, the average treatment defaulter and treatment failure rates were 4.2% and 0.3%, respectively. Conclusion: It is possible to achieve the recommended WHO targets of a 70% case detection rate (CDR) for smear-positive pulmonary TB and an 85% treatment success rate (TSR), as the treatment target of more than 85% was already met in the region from 2009 to 2011. However, strong efforts are required to raise the 40.9% case detection rate for smear-positive pulmonary TB by implementing alternative case finding strategies.

Keywords: Gambella Region, case detection rate, directly observed treatment short-course, treatment success rate, tuberculosis

Procedia PDF Downloads 340
10789 Fixed Point of Lipschitz Quasi Nonexpansive Mappings

Authors: Maryam Moosavi, Hadi Khatibzadeh

Abstract:

The main purpose of this paper is to study the proximal point algorithm for quasi-nonexpansive mappings in Hadamard spaces. Δ-convergence and strong convergence of the cyclic resolvents for a finite family of quasi-nonexpansive mappings to a common fixed point of the mappings are established.
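
For context, the classical proximal point algorithm referred to above generates, from a starting point x_0 and step parameters λ_n > 0, the resolvent iterates displayed below (written in the standard Hadamard-space form for a convex function f, with d the geodesic distance; the cyclic variant for a family of mappings studied in the paper follows the same pattern).

\[
x_{n+1} \;=\; J_{\lambda_n}(x_n) \;=\; \operatorname{argmin}_{y}\left\{ f(y) + \frac{1}{2\lambda_n}\, d^{2}(y, x_n)\right\}, \qquad n = 0, 1, 2, \ldots
\]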

Keywords: Fixed point, Hadamard space, Proximal point algorithm, Quasi-nonexpansive sequence of mappings, Resolvent

Procedia PDF Downloads 85
10788 Performance Analysis of Proprietary and Non-Proprietary Tools for Regression Testing Using Genetic Algorithm

Authors: K. Hema Shankari, R. Thirumalaiselvi, N. V. Balasubramanian

Abstract:

The present paper addresses research in the area of regression testing, with emphasis on automated tools as well as prioritization of test cases. The uniqueness of regression testing and its cyclic nature is pointed out. The difference in approach between industry, with a business model as its basis, and academia, with a focus on data mining, is highlighted. Test metrics are discussed as a prelude to our formula for prioritization; a case study is then discussed to illustrate this methodology. An industrial case study is also described in the paper, where the number of test cases is so large that they have to be grouped into test suites. In such situations, the genetic algorithm we propose can be used to reconfigure these test suites in each cycle of regression testing. The comparison is made between a proprietary tool and an open source tool using the above-mentioned metrics. Our approach is clarified through several tables.
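
Since the keywords cite the APFD metric, a short reference implementation may help: APFD = 1 − (TF1 + … + TFm)/(n·m) + 1/(2n), where n is the number of test cases, m the number of faults, and TFi the position in the prioritized suite of the first test that reveals fault i. The fault matrix and ordering below are invented for illustration, not taken from the paper's case study.

```python
def apfd(order, fault_matrix):
    """Average Percentage of Faults Detected for a given test-case order.
    fault_matrix[t][f] is True if test t detects fault f."""
    n = len(order)
    m = len(fault_matrix[0])
    first_detection = []
    for f in range(m):
        # 1-based position of the first test in 'order' that detects fault f
        pos = next(i + 1 for i, t in enumerate(order) if fault_matrix[t][f])
        first_detection.append(pos)
    return 1.0 - sum(first_detection) / (n * m) + 1.0 / (2 * n)

# toy example: 4 tests, 3 faults
faults = [
    [True,  False, False],   # test 0
    [False, True,  False],   # test 1
    [False, False, True],    # test 2
    [True,  True,  False],   # test 3
]
print(apfd([3, 2, 0, 1], faults))   # prioritized order found by, e.g., a GA
```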

Keywords: APFD metric, genetic algorithm, regression testing, RFT tool, test case prioritization, selenium tool

Procedia PDF Downloads 430
10787 Cannabis for the Treatment of Drug Resistant Epilepsy in Children

Authors: Sarah E. Casey

Abstract:

Epilepsy is the most common neurological disorder in children, and approximately one-third of children with epilepsy have seizures that are uncontrolled on anticonvulsants alone. Cannabidiol has been shown to be effective at reducing the number of breakthrough seizures experienced by children with drug-resistant epilepsy. Improvements in quality of life and overall condition were noted during cannabidiol treatment. Adverse side effects were experienced and were generally mild to moderate in nature. Additional double-blind, controlled studies with a more diverse sample population and standardized dosing are needed to establish the efficacy and safety of cannabidiol use in children with drug-resistant epilepsy.

Keywords: cannabis, drug resistant epilepsy, children, epilepsy

Procedia PDF Downloads 218
10786 Regional Treatment Trends in Canada Derived from Pharmacy Records

Authors: John Chau, Tzvi Aviv

Abstract:

Cardiometabolic conditions (hypertension, diabetes, and hyperlipidemia) are major public health concerns. Analysis of all prescription records from about 10 million patients at the largest network of pharmacies in Canada reveals small year-over-year increases in the treatment prevalence of cardiometabolic diseases prior to the COVID-19 pandemic. Cardiometabolic treatment rates increase with age and are higher in males than in females. Hypertension treatment rates were 24% in males and 19% in females in 2021. Diabetes treatment rates were 10% in males and 7% in females in 2021. Geospatial analysis using patient addresses reveals interesting differences among provinces and neighborhoods in Canada. Digital surveys distributed to 8,504 Canadian adults showed that hypertension awareness increases with age and is higher among females. However, 7% of seniors and 6% of middle-aged Canadians reported uncontrolled blood pressure (>140/90 mmHg). In addition, elevated blood pressure (130-139/80-89 mmHg) was reported by 20% of seniors and 14% of middle-aged Canadians.

Keywords: cardiometabolic conditions, diabetes, hypertension, precision public health

Procedia PDF Downloads 114
10785 Use of Residues from Water Treatment and Porcelain Coatings Industry for Producing Eco-Bricks

Authors: Flavio Araujo, Fabiolla Lima, Julio Lima, Paulo Scalize, Antonio Albuquerque, Heitor Reis

Abstract:

One of the major environmental problems in the management of water treatment plants (WTPs) is the disposal of the waste generated during the treatment process. The same applies to the waste generated during the rectification of porcelain tiles. Despite environmental laws in Brazil, these residues do not have an ecologically balanced destination. Thus, in order to identify an environmentally sustainable disposal route, the residues were used to replace part of the soil in the production of soil-cement bricks. Residues from a WTP and from the coatings manufacturer Cecrisa (Brazil) were used. A greater amount of fine aggregate was found in the two residue samples. The residues affect the quality of the bricks produced compared to the sample without residues; however, the compression and water absorption tests yielded values that meet the standards of 2.0 MPa and 20% absorption, respectively.

Keywords: water treatment residue, porcelain tile residue, WTP, brick

Procedia PDF Downloads 478
10784 Effect of Thermal Energy on Inorganic Coagulation for the Treatment of Industrial Wastewater

Authors: Abhishek Singh, Rajlakshmi Barman, Tanmay Shah

Abstract:

Coagulation is considered to be one of the predominant water treatment processes for improving the cost effectiveness of wastewater treatment. The sole purpose of this experiment on thermal coagulation is to increase the efficiency and the rate of reaction. The process uses renewable sources of energy and an improved, time-minimized method, with the aim of alleviating water scarcity in regions on the brink of depletion. This paper covers the various effects of temperature on the standard coagulation treatment of wastewater and their effect on water quality. In addition, the coagulation is carried out with a mix of bottom/fly ash that acts as an adsorbent and removes most of the fine and coarse particles by adsorption, which not only helps reduce the environmental burden of fly ash but also enhances the economic benefit. The method of sand filtration is also incorporated into the process. The sand filter is an environmentally friendly wastewater treatment method, which is relatively simple and inexpensive. The experimental results obtained in this study satisfied the existing parameters. The initial turbidity of the wastewater is 162 NTU and its initial temperature is 27 °C. The temperature variation over the entire process is 50-80 °C. The concentration of alum in the wastewater is 60-320 mg/L. The turbidity range after treatment is 8.31-28.1 NTU, and the pH varies between 7.73 and 8.29. The effective time taken is 10 minutes for thermal mixing and sedimentation. The results indicate that the presence of thermal energy affects the coagulation treatment process. The influence of thermal energy on turbidity is assessed, along with the use of renewable energy sources and the increase in the reaction rate of the treatment process.

Keywords: adsorbent, sand filter, temperature, thermal coagulation

Procedia PDF Downloads 318
10783 Efficient Feature Fusion for Noise Iris in Unconstrained Environment

Authors: Yao-Hong Tsai

Abstract:

This paper presents an efficient fusion algorithm for iris images to generate stable features for recognition in unconstrained environments. Recently, iris recognition systems have focused on real scenarios in daily life that do not rely on the subject's cooperation. Under large variations in the environment, the objective of this paper is to combine information from multiple images of the same iris. The result of image fusion is a new image which is more stable for further iris recognition than each original noisy iris image. A wavelet-based approach for multi-resolution image fusion is applied in the fusion process. Detection of the iris image is based on the AdaBoost algorithm, and a local binary pattern (LBP) histogram is then applied to texture classification with a weighting scheme. Experiments showed that the features generated by the proposed fusion algorithm can improve the performance of verification systems based on iris recognition.
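
The texture-classification step mentioned above typically reduces to computing an LBP histogram per (possibly fused) iris image; a minimal sketch with scikit-image is shown below, with a random array standing in for the normalized iris texture and no claim that these parameters match the paper's setup.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray_image, P=8, R=1.0):
    """Uniform LBP codes followed by a normalized histogram feature vector."""
    codes = local_binary_pattern(gray_image, P, R, method="uniform")
    n_bins = P + 2                                     # uniform patterns + "other"
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist

# placeholder for a normalized iris texture strip (values in [0, 1])
iris_strip = np.random.default_rng(0).random((64, 256))
print(lbp_histogram(iris_strip).round(3))
```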

Keywords: image fusion, iris recognition, local binary pattern, wavelet

Procedia PDF Downloads 365
10782 A Parallel Implementation of k-Means in MATLAB

Authors: Dimitris Varsamis, Christos Talagkozis, Alkiviadis Tsimpiris, Paris Mastorocostas

Abstract:

The aim of this work is the parallel implementation of k-means in MATLAB, in order to reduce the execution time. Specifically, a new MATLAB function for the serial k-means algorithm is developed which meets all the requirements for conversion to a MATLAB function with parallel computations. Additionally, two different variants for the definition of the initial values are presented. In the sequel, the parallel approach is presented. Finally, performance tests of the computation times with respect to the number of features and classes are presented.
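
The authors' implementation is in MATLAB; purely as an illustration of the underlying idea (split the expensive assignment step across workers, then update the centroids serially), here is a small Python analogue using a thread pool. The data, cluster count, and worker count are placeholders, not the paper's setup.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def assign_chunk(chunk, centers):
    """Label each point in the chunk with its nearest centroid (the costly step)."""
    d = np.linalg.norm(chunk[:, None, :] - centers[None, :, :], axis=2)
    return d.argmin(axis=1)

def parallel_kmeans(X, k=3, iters=20, workers=4, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    chunks = np.array_split(X, workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(iters):
            # parallel assignment step over data chunks
            labels = np.concatenate(
                list(pool.map(lambda c: assign_chunk(c, centers), chunks)))
            # serial centroid update
            centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

X = np.random.default_rng(1).random((3000, 2))
centers, labels = parallel_kmeans(X)
print(centers.round(3))
```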

Keywords: K-means algorithm, clustering, parallel computations, Matlab

Procedia PDF Downloads 380
10781 Oil Pollution Analysis of the Ecuadorian Rainforest Using Remote Sensing Methods

Authors: Juan Heredia, Naci Dilekli

Abstract:

The Ecuadorian Rainforest has been polluted for almost 60 years with little to no regard for oversight, laws, or regulations. The consequences have been vast environmental damage, such as pollution and deforestation, as well as sickness and the death of many people and animals. The aim of this paper is to quantify and localize the polluted zones, something that has not previously been done and that is the first step toward remediation. To approach this problem, multi-spectral remote sensing imagery was processed with a novel algorithm developed for this study, based on four normalized indices available in the literature. The algorithm classifies the pixels as polluted or healthy. The results of this study include a new algorithm for pixel classification and a quantification of the polluted area in the selected image. These results were finally validated against ground control points found in the literature. The main conclusion of this work is that, using hyperspectral images, it is possible to identify polluted vegetation. Future work includes environmental remediation, in-situ tests, and more extensive results that would inform new policymaking.
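
Although the four specific indices are not named in the abstract, the core operation in any normalized-index classifier is the same: a per-pixel band ratio followed by a threshold. The sketch below computes a single normalized difference index and flags low-vigor pixels; the band arrays and the threshold value are placeholders, not the study's calibrated values.

```python
import numpy as np

def normalized_index(band_a, band_b):
    """Generic normalized difference index, e.g. NDVI = (NIR - Red) / (NIR + Red)."""
    band_a = band_a.astype(float)
    band_b = band_b.astype(float)
    return (band_a - band_b) / (band_a + band_b + 1e-9)

# placeholder reflectance bands (rows x cols), values in [0, 1]
rng = np.random.default_rng(0)
nir, red = rng.random((100, 100)), rng.random((100, 100))

ndvi = normalized_index(nir, red)
polluted_mask = ndvi < 0.2          # illustrative threshold for stressed vegetation
print("flagged pixels:", int(polluted_mask.sum()))
```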

Keywords: remote sensing, oil pollution quantification, Amazon forest, hyperspectral remote sensing

Procedia PDF Downloads 154
10780 Familiarity with Nursing and Description of Nurses Duties

Authors: Narges Solaymani

Abstract:

Definition of a nurse: a person who is educated and skilled in the scientific principles and professional skills of health care, treatment, and the medical training of patients. Nursing is a very important profession in societies around the world. Although in the past all caregivers of the sick and disabled were called nurses, nowadays a nurse is a person who has a university education in this field. There are nurses with bachelor's, master's, and doctoral degrees in nursing. New courses have been launched at the master's level based on duty-oriented nursing. A nurse cannot run an independent treatment center but is a member of the treatment team in established treatment centers such as hospitals, clinics, or offices. Nurses can establish counseling centers and provide nursing services at home. According to the standards, the number of nurses should be three times the number of doctors or twice the number of hospital beds, or there should be three nurses for every thousand people. International standards also indicate that in internal medicine and surgical wards there should be one nurse for every 4 to 6 patients.

Keywords: nurse, intensive care, CPR, bandage

Procedia PDF Downloads 63
10779 Verification & Validation of Map Reduce Program Model for Parallel K-Mediod Algorithm on Hadoop Cluster

Authors: Trapti Sharma, Devesh Kumar Srivastava

Abstract:

This paper is essentially an analysis of the above MapReduce implementation, aiming to verify and validate the MapReduce solution model for the parallel K-Medoid algorithm on a Hadoop cluster. MapReduce is a programming model which allows huge amounts of data to be processed in parallel on a large number of machines. It is especially well suited to constant or moderately changing sets of data, since the implementation overhead is usually high. MapReduce has gradually become the framework of choice for “big data”. The MapReduce model allows systematic and rapid processing of large-scale data with a cluster of compute nodes. One of the primary concerns in Hadoop is how to minimize the completion time (i.e., makespan) of a set of MapReduce jobs. In this paper, we have verified and validated various MapReduce applications such as wordcount, grep, terasort and the parallel K-Medoid clustering algorithm. We found that as the number of nodes increases, the completion time decreases.
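
One of the applications verified above, wordcount, is also the canonical example of the map-shuffle-reduce flow; the self-contained sketch below simulates that flow locally in Python. It illustrates the programming model only and is not Hadoop code.

```python
from collections import defaultdict

def mapper(line):
    """Map phase: emit a (word, 1) pair for every word in the input line."""
    return [(word.lower(), 1) for word in line.split()]

def reducer(word, counts):
    """Reduce phase: sum all counts collected for one key."""
    return word, sum(counts)

lines = ["the quick brown fox", "the lazy dog", "the quick dog"]

# shuffle: group intermediate pairs by key, as the framework does between phases
groups = defaultdict(list)
for line in lines:
    for word, count in mapper(line):
        groups[word].append(count)

word_counts = dict(reducer(w, c) for w, c in groups.items())
print(word_counts)      # {'the': 3, 'quick': 2, ...}
```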

Keywords: hadoop, mapreduce, k-mediod, validation, verification

Procedia PDF Downloads 363
10778 Fingerprint Image Encryption Using a 2D Chaotic Map and Elliptic Curve Cryptography

Authors: D. M. S. Bandara, Yunqi Lei, Ye Luo

Abstract:

Fingerprints are suitable as long-term markers of human identity since they provide detailed and unique individual features which are difficult to alter and durable over a lifetime. In this paper, we propose an algorithm to encrypt and decrypt fingerprint images by using a specially designed Elliptic Curve Cryptography (ECC) procedure based on block ciphers. In addition, to increase the confusion effect of the fingerprint encryption, we also utilize a chaotic method called the Arnold Cat Map (ACM) for 2D scrambling of pixel locations. Experiments are carried out with various types of efficiency and security analyses. As a result, we demonstrate that the proposed fingerprint encryption/decryption algorithm is advantageous in several different aspects, including efficiency, security and flexibility. In particular, using this algorithm, we achieve a margin of about 0.1% in the Number of Pixel Changing Rate (NPCR) test compared to state-of-the-art performance.
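
The ACM scrambling step mentioned above is simple enough to show in full: each pixel (x, y) of an N×N image is mapped to ((x + y) mod N, (x + 2y) mod N), and iterating the map shuffles pixel locations chaotically. The sketch below implements this classical form; the paper's exact parameters and the ECC stage are not reproduced.

```python
import numpy as np

def arnold_cat_map(img, iterations=1):
    """Scramble pixel positions of a square image with the classical Arnold cat map."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "ACM is defined on square images"
    out = img.copy()
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        scrambled = np.empty_like(out)
        scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

# tiny demo: scrambling is a permutation, so the pixel value multiset is unchanged
img = np.arange(64, dtype=np.uint8).reshape(8, 8)
scrambled = arnold_cat_map(img, iterations=3)
print(np.array_equal(np.sort(img, axis=None), np.sort(scrambled, axis=None)))  # True
```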

Keywords: arnold cat map, biometric encryption, block cipher, elliptic curve cryptography, fingerprint encryption, Koblitz’s encoding

Procedia PDF Downloads 200
10777 HR MRI CS Based Image Reconstruction

Authors: Krzysztof Malczewski

Abstract:

A Magnetic Resonance Imaging (MRI) reconstruction algorithm using compressed sensing is presented in this paper. It is shown that the proposed approach improves the spatial resolution of MR images when highly undersampled k-space trajectories are applied. Compressed sensing (CS) aims at reconstructing signals and images from significantly fewer measurements than were conventionally assumed necessary. MRI is a fundamental medical imaging method that struggles with an inherently slow data acquisition process. Applying CS to MRI has the potential for significant scan time reductions, with clear benefits for patients and health care economics. In this study, the objective is to combine a super-resolution image enhancement algorithm with the benefits of the CS framework to achieve a high-resolution MR output image. Both methods emphasize maximizing image sparsity in a known sparse transform domain while minimizing the data-fidelity error. The presented algorithm also accounts for cardiac and respiratory motion.
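
A bare-bones illustration of the CS idea applied to undersampled k-space is given below: alternate a sparsity-promoting soft-threshold with data consistency on the measured k-space samples. It is a toy (sparsity is imposed directly in the image domain, and there is no super-resolution or motion handling), not the algorithm proposed in the paper.

```python
import numpy as np

def cs_mri_toy(kspace, mask, lam=0.02, iters=100):
    """Toy CS reconstruction: soft-threshold in the image domain, then re-impose
    the measured k-space samples (data consistency). Illustrative only."""
    img = np.fft.ifft2(kspace * mask)
    for _ in range(iters):
        mag = np.abs(img)
        img = img / np.maximum(mag, 1e-12) * np.maximum(mag - lam, 0.0)  # soft-threshold
        k = np.fft.fft2(img)
        k[mask] = kspace[mask]                     # keep the acquired samples
        img = np.fft.ifft2(k)
    return np.abs(img)

# synthetic sparse "phantom" and a random 30% sampling mask
rng = np.random.default_rng(0)
phantom = np.zeros((64, 64))
phantom[rng.random((64, 64)) < 0.02] = 1.0
mask = rng.random((64, 64)) < 0.3
recon = cs_mri_toy(np.fft.fft2(phantom), mask)
print("reconstruction error:", round(float(np.linalg.norm(recon - phantom)), 3))
```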

Keywords: super-resolution, MRI, compressed sensing, sparse-sense, image enhancement

Procedia PDF Downloads 425
10776 Using Digitally Reconstructed Radiographs from Magnetic Resonance Images to Localize Pelvic Lymph Nodes on 2D X-Ray Simulator-Based Brachytherapy Treatment Planning

Authors: Mohammad Ali Oghabian, Reza Reiazi, Esmaeel Parsai, Mehdi Aghili, Ramin Jaberi

Abstract:

In this project, a new procedure is introduced for utilizing digitally reconstructed radiographs derived from MRI images in brachytherapy treatment planning. This procedure enables us to localize the tumor volume and delineate the extent of critical structures in the vicinity of the tumor volume. The aim of this project was to improve the accuracy of the dose delivered to targets of interest in a 2D treatment planning system.

Keywords: brachytherapy, cervix, digitally reconstructed radiographs, lymph node

Procedia PDF Downloads 525
10775 Triangulations via Iterated Largest Angle Bisection

Authors: Yeonjune Kang

Abstract:

A triangulation of a planar region is a partition of that region into triangles. In the finite element method, triangulations are often used as the grid underlying a computation. In order to be suitable as a finite element mesh, a triangulation must have well-shaped triangles, according to criteria that depend on the details of the particular problem. For instance, most methods require that all triangles be small and as close to the equilateral shape as possible. Stated differently, one wants to avoid having either thin or flat triangles in the triangulation. There are many triangulation procedures, a particular one being the longest edge bisection algorithm described below. Starting with a given triangle, locate the midpoint of the longest edge and join it to the opposite vertex of the triangle. Two smaller triangles are formed; apply the same bisection procedure to each of these triangles. Continuing in this manner, after n steps one obtains a triangulation of the initial triangle into 2ⁿ smaller triangles. The longest edge algorithm was first considered in the late 70s. It was shown by various authors that this triangulation has the desirable properties for the finite element method: independently of the number of iterations, the angles of these triangles cannot get too small; moreover, the size of the triangles decays exponentially. In the present paper we consider a related triangulation algorithm that we refer to as the largest angle bisection procedure. As the name suggests, rather than bisecting the longest edge, at each step we bisect the largest angle. We study the properties of the resulting triangulation and prove that, while the general behavior resembles that of the longest edge bisection algorithm, there are several notable differences as well.
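
A minimal sketch of the procedure studied here: at each step find the vertex of the largest angle (the one opposite the longest side) and, by the angle bisector theorem, split the opposite side in the ratio of the two adjacent sides. Mesh bookkeeping and the paper's analysis are omitted; this only illustrates the bisection rule.

```python
import numpy as np

def bisect_largest_angle(tri):
    """Split one triangle (three 2D vertices) by bisecting its largest angle."""
    tri = [np.asarray(p, dtype=float) for p in tri]
    # side i is opposite vertex i; the largest angle sits opposite the longest side
    sides = [np.linalg.norm(tri[(i + 1) % 3] - tri[(i + 2) % 3]) for i in range(3)]
    v = int(np.argmax(sides))
    V, P, Q = tri[v], tri[(v + 1) % 3], tri[(v + 2) % 3]
    a, b = np.linalg.norm(V - P), np.linalg.norm(V - Q)
    D = P + (a / (a + b)) * (Q - P)       # angle bisector theorem: PD/DQ = |VP|/|VQ|
    return [(V, P, D), (V, D, Q)]

def refine(tri, steps):
    """Apply largest-angle bisection recursively; 2**steps triangles result."""
    if steps == 0:
        return [tri]
    halves = bisect_largest_angle(tri)
    return refine(halves[0], steps - 1) + refine(halves[1], steps - 1)

mesh = refine([(0.0, 0.0), (1.0, 0.0), (0.2, 0.9)], steps=4)
print(len(mesh))   # 16 triangles after 4 bisection steps
```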

Keywords: angle bisectors, geometry, triangulation, applied mathematics

Procedia PDF Downloads 398
10774 Traditional Drawing, BIM and Erudite Design Process

Authors: Maryam Kalkatechi

Abstract:

Nowadays, parametric design, scientific analysis, and digital fabrication are dominant. Many architectural practices are increasingly seeking to incorporate advanced digital software and fabrication in their projects. The erudite design process proposed here, which combines digital and practical aspects within a strong methodological frame, resulted from the dissertation research. The digital aspects are the progressive advancements in algorithmic design and simulation software. These aspects have helped firms develop more holistic concepts at the early stage and maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to embed construction and architectural knowledge within the algorithm in order to achieve successful design processes. The erudite design process also incorporates the ongoing improvements in applying 3D printing to construction. This is achieved through the ‘data-sketches’. The term ‘data-sketch’ was developed by the author in the recently completed dissertation. It accommodates the decisions of the architect on the algorithm. This paper introduces the erudite design process and its components. It summarizes the application of this process in the development of the ‘3D printed construction unit’. This paper contributes to bridging academia and practice with advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes.

Keywords: erudite, data-sketch, algorithm design in architecture, design process

Procedia PDF Downloads 270
10773 Facial Biometric Privacy Using Visual Cryptography: A Fundamental Approach to Enhance the Security of Facial Biometric Data

Authors: Devika Tanna

Abstract:

'Biometrics' means 'life measurement', but the term is usually associated with the use of unique physiological characteristics to identify an individual. It is important to secure the privacy of digital face images stored in a central database. To impart privacy to such biometric face images, the digital face image is first split into two host face images such that each of them gives no hint of the existence of the original face image; each cover image is then stored in one of two geographically separate databases. Only when both cover images are simultaneously available can the original image be accessed. This is achieved using the XM2VTS and IMM face databases and an adaptive algorithm for spatial greyscale images. The algorithm helps to select the appropriate host images, which are the ones most likely to be compatible with the secret image stored in the central database, based on its geometry and appearance. The encryption is done using GEVCS, which results in a reconstructed image identical to the original private image.
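
GEVCS itself is involved, but the underlying "two shares, neither reveals anything alone" idea can be shown with a much simpler XOR-based (2, 2) secret-sharing sketch: one share is pure noise, the other is the image XOR-ed with that noise, and only combining them restores the face image. This is a simplified stand-in for illustration, not the scheme used in the paper.

```python
import numpy as np

def make_shares(secret_img, seed=0):
    """Split an 8-bit greyscale image into two shares; each alone is random noise."""
    rng = np.random.default_rng(seed)
    share1 = rng.integers(0, 256, size=secret_img.shape, dtype=np.uint8)
    share2 = np.bitwise_xor(secret_img, share1)    # stored in a second database
    return share1, share2

def reconstruct(share1, share2):
    return np.bitwise_xor(share1, share2)

# placeholder "face image" of random 8-bit pixels
face = np.random.default_rng(1).integers(0, 256, size=(112, 92), dtype=np.uint8)
s1, s2 = make_shares(face)
print(np.array_equal(reconstruct(s1, s2), face))   # True
```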

Keywords: adaptive algorithm, database, host images, privacy, visual cryptography

Procedia PDF Downloads 127
10772 Ischemic Stroke Detection in Computed Tomography Examinations

Authors: Allan F. F. Alves, Fernando A. Bacchim Neto, Guilherme Giacomini, Marcela de Oliveira, Ana L. M. Pavan, Maria E. D. Rosa, Diana R. Pina

Abstract:

Stroke is a worldwide concern; in Brazil alone it accounts for 10% of all registered deaths. There are two stroke types, ischemic (87%) and hemorrhagic (13%). Early diagnosis is essential to avoid irreversible cerebral damage. Non-enhanced computed tomography (NECT) is one of the main diagnostic techniques used, due to its wide availability and rapid diagnosis. Detection depends on the size and severity of the lesions and on the time elapsed between the first symptoms and the examination. The Alberta Stroke Program Early CT Score (ASPECTS) is a subjective method that increases the detection rate. The aim of this work was to implement an image segmentation system to enhance ischemic stroke and to quantify the area of ischemic and hemorrhagic stroke lesions in CT scans. We evaluated 10 patients with NECT examinations diagnosed with ischemic stroke. Analyses were performed on two axial slices, one at the level of the thalamus and basal ganglia and one adjacent to the top edge of the ganglionic structures, with a window width between 80 and 100 Hounsfield units. We used different image processing techniques such as morphological filters, the discrete wavelet transform and Fuzzy C-means clustering. Subjective analyses were performed by a neuroradiologist according to the ASPECTS scale to quantify ischemic areas in the middle cerebral artery region. These subjective results were compared with the objective analyses performed by the computational algorithm. Preliminary results indicate that the morphological filters do enhance the ischemic areas for subjective evaluation. The comparison between the area of the ischemic region contoured by the neuroradiologist and the area defined by the computational algorithm showed no deviations greater than 12% in any of the 10 examinations, although the areas contoured by the neuroradiologist tend to be smaller than those obtained by the algorithm. These results show the importance of computer-aided diagnosis software in assisting neuroradiology decisions, especially in critical situations such as the choice of treatment for ischemic stroke.
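
Of the processing steps listed above, the Fuzzy C-means clustering stage is the most algorithmic; a compact version operating on pixel intensities is sketched below. The windowing, morphological filtering, and wavelet steps are omitted, and the intensity data are synthetic, so this is only an illustration of the clustering idea.

```python
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
    """Minimal Fuzzy C-means on feature vectors X with shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                             # memberships sum to 1 per sample
    for _ in range(iters):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        dist = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        U = dist ** (-2.0 / (m - 1.0))             # standard FCM membership update
        U /= U.sum(axis=0)
    return centers, U

# synthetic slice intensities (Hounsfield units): normal tissue vs. hypodense lesion
rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(35, 2, 900), rng.normal(25, 2, 100)]).reshape(-1, 1)
centers, memberships = fuzzy_cmeans(pixels, c=2)
print(centers.ravel().round(1))     # roughly the two tissue means
```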

Keywords: ischemic stroke, image processing, CT scans, Fuzzy C-means

Procedia PDF Downloads 365
10771 A Non-Parametric Based Mapping Algorithm for Use in Audio Fingerprinting

Authors: Analise Borg, Paul Micallef

Abstract:

Over the past few years, online multimedia collections have grown at a fast pace. Several companies have shown interest in studying different ways to organize this amount of audio information without the need for human intervention to generate metadata. In the past few years, many applications have emerged on the market which are capable of identifying a piece of music in a short time. Different audio effects and degradations make it much harder to identify the unknown piece. In this paper, an audio fingerprinting system which makes use of a non-parametric algorithm is presented. Parametric analysis is also performed using Gaussian Mixture Models (GMMs). The feature extraction methods employed are the Mel Spectrum Coefficients and the MPEG-7 basic descriptors. Bin numbers replaced the extracted feature coefficients during the non-parametric modelling. The results show that the non-parametric analysis offers results comparable to those reported in the literature.
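
As a sketch of the parametric (GMM) branch of the comparison, the snippet below fits a Gaussian mixture to MFCC frames of a reference clip and scores a query clip against it. The file names are placeholders, the component count is an assumption, and librosa plus scikit-learn are assumed to be available.

```python
import librosa
from sklearn.mixture import GaussianMixture

def mfcc_frames(path, n_mfcc=13):
    """Load an audio file and return its MFCC frames (frames x coefficients)."""
    y, sr = librosa.load(path, sr=None)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T

reference = mfcc_frames("reference_song.wav")     # hypothetical enrolled track
query = mfcc_frames("radio_capture.wav")          # hypothetical degraded capture

gmm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0)
gmm.fit(reference)
print("average log-likelihood of query:", gmm.score(query))
```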

Keywords: audio fingerprinting, mapping algorithm, Gaussian Mixture Models, MFCC, MPEG-7

Procedia PDF Downloads 414
10770 Effects of Ultraviolet Treatment on Microbiological Load and Phenolic Content of Vegetable Juice

Authors: Kubra Dogan, Fatih Tornuk

Abstract:

Due to increasing consumer demand for high-quality food products and awareness of the health benefits of different nutrients in food, minimal processing is becoming more popular in modern food preservation. To date, heat treatment has often been used for the inactivation of spoilage microorganisms in foods. However, it may cause significant changes in the quality and nutritional properties of food. In order to overcome the detrimental effects of heat treatment, several alternative non-thermal microbial inactivation processes have been investigated. Ultraviolet (UV) inactivation is a promising and feasible alternative to heat treatment for better quality and longer shelf life, aiming to inhibit spoilage and pathogenic microorganisms and to inactivate enzymes in vegetable juice production. UV-C is a sub-class of UV treatment which shows the highest microbicidal effect between 250 and 270 nm. The wavelength of 254 nm is used for the surface disinfection of certain liquid food products such as vegetable juice. The effects of UV-C treatment on the microbiological load and quality parameters of a vegetable juice, a mix of celery, carrot, lemon and orange, were investigated. Our results showed that storing the UV-C-treated vegetable juice for three months reduced the TMAB count by 3.5 log cfu/g and the yeast-mold count by 2 log cfu/g compared to the control sample. The total phenolic content was found to be 514.3 ± 0.6 mg gallic acid equivalent/L, with no significant difference compared to the control. The present work suggests that UV-C treatment is an alternative method for the disinfection of vegetable juice, since it enables adequate microbial inactivation and a longer shelf life, with minimal degradation of the quality parameters of the juice.

Keywords: heat treatment, phenolic content, shelf life, ultraviolet (UV-C), vegetable juice

Procedia PDF Downloads 207
10769 Reverse Osmosis Application on Sewage Tertiary Treatment

Authors: Elisa K. Schoenell, Cristiano De Oliveira, Luiz R. H. Dos Santos, Alexandre Giacobbo, Andréa M. Bernardes, Marco A. S. Rodrigues

Abstract:

Water is an indispensable natural resource which must be preserved for human activities as well as for ecosystems. However, sewage discharge has been contaminating water resources. Conventional treatment, such as physicochemical treatment followed by biological processes, has not been efficient in completely degrading persistent organic compounds such as medicines and hormones. Therefore, the use of advanced technologies for sewage treatment has become urgent and necessary. The aim of this study was to apply Reverse Osmosis (RO) as a tertiary treatment for sewage from a Waste Water Treatment Plant (WWTP) in southern Brazil. A 200 L sample of sewage pre-treated by a wetland with aquatic macrophytes was collected. The sewage was treated in an RO pilot plant using a BW30-4040 polyamide membrane (DOW FILMTEC) with a 7.2 m² membrane area. In order to avoid damage to the equipment, the system contains a pleated polyester filter with a 5 µm pore size. A pressure of 8 bar was applied until a concentration factor of 5 was achieved, giving 80% permeate recovery at a concentrate flow rate of 10 L/min. Samples of the sewage pre-treated at the WWTP and of the permeate and concentrate generated by RO were analyzed for physicochemical parameters and by gas chromatography (GC) for qualitative analysis of organic compounds. The results showed that the sewage treated at the WWTP does not comply with the phosphorus and nitrogen limits of Brazilian legislation. In addition, many organic compounds were found in this sewage, such as benzene, which is carcinogenic. The permeate results showed that RO as a sewage tertiary treatment was efficient in removing physicochemical contaminants, achieving 100% removal of iron, copper, zinc and phosphorus, 98% removal of color, 91% of BOD and 62% of ammoniacal nitrogen. RO was capable of removing organic compounds; however, some organic compounds were still present in the RO permeate, showing that RO is not capable of removing all the organic compounds in sewage. It should be noted that the permeate showed lower peak intensities in the chromatogram than the WWTP sewage. It is important to note that the concentrate generated by RO needs treatment before its disposal into the environment.

Keywords: organic compounds, reverse osmosis, sewage treatment, tertiary treatment

Procedia PDF Downloads 197
10768 Distribution System Planning with Distributed Generation and Capacitor Placements

Authors: Nattachote Rugthaicharoencheep

Abstract:

This paper presents a feeder reconfiguration problem in distribution systems. The objective is to minimize the system power loss and to improve the bus voltage profile. The optimization problem is subject to system constraints consisting of load-point voltage limits, a radial configuration format, no load-point interruption, and feeder capability limits. A method based on a genetic algorithm, a search algorithm based on the mechanics of natural selection and natural genetics, is proposed to determine the optimal configuration pattern. The developed methodology is demonstrated on a 33-bus radial distribution system with distributed generation and feeder capacitors. The study results show that the optimal on/off patterns of the switches can be identified to give the minimum power loss while respecting all the constraints.
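
As a structural sketch of the optimization loop only, a binary-coded genetic algorithm over switch on/off patterns might look like the following. The fitness function here is a made-up placeholder; a real implementation would run a load flow and penalize non-radial or voltage-violating configurations, which is not reproduced from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N_SWITCHES = 12               # hypothetical number of tie/sectionalizing switches

def power_loss(pattern):
    """Placeholder fitness: stands in for a load-flow-based loss evaluation
    with penalties for constraint violations."""
    target = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1])
    return int(np.sum(pattern != target))

def genetic_algorithm(pop_size=40, generations=60, p_mut=0.05):
    pop = rng.integers(0, 2, size=(pop_size, N_SWITCHES))
    for _ in range(generations):
        fitness = np.array([power_loss(ind) for ind in pop])
        # binary tournament selection
        i, j = rng.integers(0, pop_size, (2, pop_size))
        parents = pop[np.where(fitness[i] < fitness[j], i, j)]
        # single-point crossover on consecutive parent pairs
        children = parents.copy()
        for k in range(0, pop_size - 1, 2):
            cut = rng.integers(1, N_SWITCHES)
            children[k, cut:] = parents[k + 1, cut:]
            children[k + 1, cut:] = parents[k, cut:]
        # bit-flip mutation
        flip = rng.random(children.shape) < p_mut
        children[flip] ^= 1
        pop = children
    return min(pop, key=power_loss)

print(genetic_algorithm())        # best on/off switch pattern found
```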

Keywords: network reconfiguration, distributed generation, capacitor placement, loss reduction, genetic algorithm

Procedia PDF Downloads 172