Search results for: step method
20288 Impact on the Yield of Flavonoid and Total Phenolic Content from Pomegranate Fruit by Different Extraction Methods
Authors: Udeshika Yapa Bandara, Chamindri Witharana, Preethi Soysa
Abstract:
Pomegranate fruits are used in cancer treatment in Ayurveda in Sri Lanka. Because the therapeutic effects of the fruit derive from its phytochemicals, this study focused on the anti-cancer properties of the constituents of the parts of the pomegranate fruit. Furthermore, the method of extraction is a crucial step in phytochemical analysis, so different extraction methods were compared. Five techniques were applied to the peel and the pericarp to identify the most effective extraction method: boiling on an electric burner (BL), sonication (SN), microwaving (MC), heating in a 50°C water bath (WB), and sonication followed by microwaving (SN-MC). The polyphenolic and flavonoid contents of the extracts were evaluated to identify the best extraction method for polyphenols. The total phenolic content was measured spectrophotometrically by the Folin-Ciocalteu method and expressed as gallic acid equivalents (w/w% GAE). Total flavonoid content was also determined spectrophotometrically, with the aluminium chloride colourimetric assay, and expressed as quercetin equivalents (w/w% QE). Pomegranate juice was tested both as fermented juice (with Saccharomyces bayanus) and as fresh juice. Powdered seeds were refluxed, filtered and freeze-dried. For extraction, 2 g of freeze-dried powder of each component was dissolved in 100 ml of de-ionized water. To compare antioxidant activity with total phenol content, the polyphenols were removed on a polyvinylpolypyrrolidone (PVPP) column, and the fermented and fresh juices were tested for 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging activity before and after the removal of polyphenols. For the peel samples of pomegranate fruit, total phenol and flavonoid contents were highest with sonication (SN). In the pericarp, total phenol and flavonoid contents were likewise highest with sonication (SN).
A significant difference (P < 0.05) in total phenol and flavonoid contents was observed between the five extraction methods for both peel and pericarp samples. Fermented juice had greater polyphenolic and flavonoid contents than fresh juice. After the polyphenols of the fermented and fresh juices were removed on the polyvinylpolypyrrolidone (PVPP) column, low antioxidant activity resulted in the DPPH radical scavenging assay. The seeds had very low total phenol and flavonoid contents. Although pomegranate peel is the main waste component of the fruit, it has excellent polyphenolic and flavonoid contents compared to the other parts of the fruit, regardless of the method of extraction. Polyphenols play a major role in antioxidant activity. Keywords: antioxidant activity, flavonoids, polyphenols, pomegranate
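The "w/w% GAE" figures above come from a spectrophotometric calibration against gallic acid standards. The sketch below shows the usual calibration-curve calculation; all concentrations and absorbances are invented for illustration, not the study's measurements.

```python
# Sketch of the gallic acid calibration behind "GAE": fit a straight line
# to the absorbances of gallic acid standards (Folin-Ciocalteu assay) and
# read the sample concentration off it. Numbers are illustrative only.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx            # slope, intercept

std_conc = [0.0, 25.0, 50.0, 100.0, 200.0]   # gallic acid, mg/L
std_abs = [0.02, 0.15, 0.28, 0.54, 1.06]     # absorbance at 765 nm
m, b = fit_line(std_conc, std_abs)

sample_abs = 0.40
gae_mg_per_l = (sample_abs - b) / m           # sample phenolics as GAE
```

The same linear-calibration step applies to the quercetin-equivalent flavonoid assay, with a quercetin standard series in place of gallic acid.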
Procedia PDF Downloads 161
20287 On the Use of Reliability Factors to Reduce Conflict between Information Sources in Dempster-Shafer Theory
Authors: A. Alem, Y. Dahmani, A. Hadjali, A. Boualem
Abstract:
The problem of conflict, whether handled within Dempster-Shafer theory or through the fusion process, has pushed researchers in recent years to find ways of making better decisions, especially in information systems, computer vision, robotics, and wireless sensor networks. In this paper, we are interested in taking the conflict into account at the combination step and managing it in such a way that it does not influence the decision step, even when the conflict comes from reliable sources. According to [1], conflict leads to erroneous decisions when its degree between sources of information is strong; if the conflict exceeds the maximum of the belief mass functions, K > max1...n (mi (A)), the decision becomes impossible. We demonstrate in this paper that multiplying the mass functions by reliability coefficients is a decreasing operation: it leads to a reduction of the conflict and to a sound decision. Reliability coefficients are defined precisely and multiplied by the mass functions of each information source, so as to resolve the conflict and allow a decision whatever the degree of conflict. The technique is evaluated on a use case, comparing the combination of sources under maximum conflict without and with reliability coefficients. Keywords: Dempster-Shafer theory, fusion process, conflict managing, reliability factors, decision
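The mechanism described above — multiplying mass functions by reliability coefficients to lower the conflict K — can be sketched with classical Shafer discounting, where the mass removed by the coefficient is transferred to the whole frame (ignorance). This is a minimal illustration of the principle, not the authors' exact algorithm; the frame, masses, and reliability value are invented.

```python
# Sketch: Shafer discounting on a frame {A, B}; "AB" denotes the set {A, B}.
# Discounting each source by a reliability coefficient alpha < 1 reduces
# Dempster's degree of conflict K between two strongly disagreeing sources.

def discount(m, alpha):
    """Multiply mass function m by reliability alpha; move the rest to AB."""
    d = {k: alpha * v for k, v in m.items()}
    d["AB"] = d.get("AB", 0.0) + (1.0 - alpha)
    return d

def conflict(m1, m2):
    """Degree of conflict K: total mass on pairs with empty intersection."""
    empty_pairs = {("A", "B"), ("B", "A")}   # disjoint singleton pairs
    return sum(m1[x] * m2[y] for x in m1 for y in m2
               if (x, y) in empty_pairs)

m1 = {"A": 0.9, "B": 0.1}   # source 1 strongly supports A
m2 = {"A": 0.1, "B": 0.9}   # source 2 strongly supports B: high conflict

k_raw = conflict(m1, m2)
k_disc = conflict(discount(m1, 0.7), discount(m2, 0.7))
```

With these numbers K drops from 0.82 to about 0.40, illustrating why the discounted combination can remain below the decision-blocking threshold discussed in the abstract.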
Procedia PDF Downloads 426
20286 A Method for Modeling Flexible Manipulators: Transfer Matrix Method with Finite Segments
Authors: Haijie Li, Xuping Zhang
Abstract:
This paper presents a computationally efficient method for modeling robot manipulators with flexible links and joints. The approach combines the discrete-time transfer matrix method with the finite segment method: the flexible links are discretized into a number of rigid segments connected by torsion springs, and the flexibility of the joints is likewise modeled by torsion springs. The proposed method avoids assembling the global dynamics equations and has the advantage of handling non-uniform manipulators. Experiments and simulations on a single-link flexible manipulator are conducted to verify the proposed methodology, and simulations of a three-link robot arm with link and joint flexibility are also performed. Keywords: flexible manipulator, transfer matrix method, linearization, finite segment method
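The finite-segment idea above — rigid inertias chained by torsion springs, with the state propagated segment by segment instead of assembling global equations — can be illustrated with a frequency-domain transfer-matrix toy model. This is a hedged sketch (the paper uses a discrete-time formulation); the spring stiffness and inertia values are assumptions.

```python
import math

# Illustrative transfer-matrix sketch: a link split into rigid inertias J
# connected by torsion springs k. The state [theta, M] (angle, torque) is
# propagated through each spring and inertia; for a clamped-free chain the
# natural frequencies are the omegas where torque vanishes at the free end.

def end_torque(omega, springs, inertias):
    theta, M = 0.0, 1.0                 # clamped base: theta = 0, unit torque
    for k, J in zip(springs, inertias):
        theta += M / k                  # across a torsion spring
        M -= omega**2 * J * theta       # across a rigid inertia (harmonic)
    return M                            # must be zero at the free end

def natural_frequency(springs, inertias, lo=1e-3, hi=1e3):
    for _ in range(200):                # bisection on the boundary residual
        mid = 0.5 * (lo + hi)
        if end_torque(lo, springs, inertias) * end_torque(mid, springs, inertias) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# One segment: the exact fundamental frequency is sqrt(k/J).
w = natural_frequency([100.0], [1.0])
```

With more entries in `springs`/`inertias` the same loop models a finer segmentation of the link, which is the finite-segment discretization the abstract describes.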
Procedia PDF Downloads 430
20285 Blue Hydrogen Production Via Catalytic Aquathermolysis Coupled with Direct Carbon Dioxide Capture Via Adsorption
Authors: Sherif Fakher
Abstract:
Hydrogen has been gaining global attention as a rising contributor to the energy sector. Labeled an energy carrier, hydrogen is used in many industries and can be used to generate electricity via fuel cells. Blue hydrogen involves the production of hydrogen from hydrocarbons using processes that emit CO₂; the CO₂, however, is captured and stored, so very little environmental damage occurs during the hydrogen production process. This research investigates the ability of different catalysts to produce hydrogen from different hydrocarbon sources, including coal, oil, and gas, using a two-step aquathermolysis reaction. It presents the results of experiments conducted to evaluate different catalysts and also highlights the main advantages of this process over other blue hydrogen production methods, including methane steam reforming, autothermal reforming, and oxidation. Two methods of hydrogen generation were investigated: partial oxidation and aquathermolysis. For both reactions, the reaction kinetics, thermodynamics, and medium were investigated. Experiments were then conducted to test the hydrogen generation potential of both methods. The porous media tested were sandstone, ash, and pozzolanic material. The spent oils used were spent motor oil and spent vegetable oil from cooking. Experiments were conducted at temperatures up to 250 °C and pressures up to 3000 psi. Based on the experimental results, mathematical models were developed to predict the hydrogen generation potential at higher thermodynamic conditions. Since both partial oxidation and aquathermolysis require relatively high temperatures to proceed, it was important to devise a method by which these temperatures can be generated at low cost. This was done by investigating two factors: the porous medium used and the reliance on the spent oil.
Of all the porous media used, the ash had the highest thermal conductivity. The second step was the partial combustion of part of the spent oil to generate the heat needed to reach the high temperatures, which reduced the cost of heat generation significantly. For the partial oxidation reaction, the spent oil was burned in the presence of a limited oxygen concentration to generate carbon monoxide. The main drawback of this process was the need for burning, which generated other harmful and environmentally damaging gases. Aquathermolysis does not rely on burning, which makes it the cleaner alternative; however, it needs much higher temperatures to run the reaction. When the hydrogen generation potential of the two routes was compared by gas chromatography, aquathermolysis generated 23% more hydrogen than partial oxidation from the same volume of spent oil. This research introduces the concept of using spent oil for hydrogen production, which can be a very promising way to produce a clean source of energy from a waste product. It can also help reduce the reliance on fresh water for hydrogen generation, diverting that water to other, more important applications. Keywords: blue hydrogen production, catalytic aquathermolysis, direct carbon dioxide capture, CCUS
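The abstract mentions mathematical models used to extrapolate hydrogen generation to conditions above the tested 250 °C. One common minimal form for such extrapolation is an Arrhenius-type temperature dependence; this is an assumption for illustration, not the authors' stated model, and all rate values below are invented.

```python
import math

# Hypothetical sketch: fit an Arrhenius rate k = A * exp(-Ea / (R * T))
# from rates measured at two temperatures, then extrapolate to a higher
# temperature. The rates and temperatures are made-up illustration values.

R = 8.314  # gas constant, J/(mol K)

def fit_arrhenius(T1, k1, T2, k2):
    """Recover (A, Ea) from two (temperature, rate) measurements."""
    Ea = R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)
    A = k1 * math.exp(Ea / (R * T1))
    return A, Ea

def rate(A, Ea, T):
    return A * math.exp(-Ea / (R * T))

A, Ea = fit_arrhenius(473.15, 0.02, 523.15, 0.08)   # 200 °C and 250 °C
k_300 = rate(A, Ea, 573.15)                          # extrapolate to 300 °C
```
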
Procedia PDF Downloads 31
20284 Evaluation of Negative Air Ions in Bioaerosol Removal: Indoor Concentration of Airborne Bacterial and Fungal in Residential Building in Qom City, Iran
Authors: Z. Asadgol, A. Nadali, H. Arfaeinia, M. Khalifeh Gholi, R. Fateh, M. Fahiminia
Abstract:
The present investigation was conducted to determine the types and concentrations of bacterial and fungal bioaerosols in one room (a bedroom) of each selected residential building located in different regions of Qom, during February 2015 (n=9) and July 2016 (n=11). Moreover, we evaluated the efficiency of negative air ions (NAIs) in reducing bioaerosols in the indoor air of residential buildings. In the first step, the mean concentrations of bacteria and fungi at the nine sampling sites evaluated in winter were 744 and 579 colony forming units (CFU)/m3, while the corresponding values at the 11 sampling sites evaluated in summer were 1628.6 and 231 CFU/m3, respectively. The most predominant bacterial and fungal genera at all sampling sites were Micrococcus spp. and Staphylococcus spp., and Aspergillus spp. and Penicillium spp., respectively. Of the sampling sites, 95% and 45% had bacterial and fungal concentrations over the recommended levels, respectively. In the removal step, using NAIs, we achieved reductions ranging from 38% to 93% for bacterial genera and from 25% to 100% for fungal genera. The results suggest that NAI treatment is a highly effective, simple and efficient technique for reducing bacterial and fungal concentrations in the indoor air of residential buildings. Keywords: bacterial, fungal, negative air ions (NAIs), indoor air, Iran
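The 38-93% and 25-100% reduction figures above are percent removal efficiencies. A minimal sketch of that calculation follows; the CFU values are invented for illustration, not the study's measurements.

```python
# Sketch of the removal-efficiency calculation implied by the reported
# reductions: percent drop in bioaerosol concentration after NAI treatment.

def removal_efficiency(before_cfu, after_cfu):
    """Percent reduction in concentration (CFU/m3) after treatment."""
    return 100.0 * (before_cfu - after_cfu) / before_cfu

e = removal_efficiency(744.0, 52.0)   # hypothetical before/after counts
```
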
Procedia PDF Downloads 405
20283 Cell Biomass and Lipid Productivities of Meyerella planktonica under Autotrophic and Heterotrophic Growth Conditions
Authors: Rory Anthony Hutagalung, Leonardus Widjaja
Abstract:
The microalga Meyerella planktonica is a potential biofuel source because it can be grown in bulk under either autotrophic or heterotrophic conditions. However, the quantitative growth of this alga is still low, as it tends to precipitate to the bottom, and its lipid concentration is low when it is grown autotrophically. In contrast, heterotrophic conditions can enhance the lipid concentration. A combination of autotrophic conditions and agitation was used to increase the density of the culture, while heterotrophic conditions were set up to raise lipid production. A two-stage experiment was applied: increasing the density in the first step and the lipid concentration in the next. The autotrophic condition resulted in higher density but lower lipid concentration than the heterotrophic one. Agitation produced higher density under both autotrophic and heterotrophic conditions. The two-stage experiment managed to enhance the density during the autotrophic stage and the lipid concentration during the heterotrophic stage. The highest yield, attained 7 days after the heterotrophic stage began, was obtained using 0.4% v/v glycerol as the carbon source (2.9±0.016 x 10⁶ cells w/w). The lipid concentration was stable from day 7. Keywords: agitation, glycerol, heterotrophic, lipid productivity, Meyerella planktonica
Procedia PDF Downloads 337
20282 Pneumoperitoneum Creation Assisted with Optical Coherence Tomography and Automatic Identification
Authors: Eric Yi-Hsiu Huang, Meng-Chun Kao, Wen-Chuan Kuo
Abstract:
For every laparoscopic surgery, safe pneumoperitoneum creation (gaining access to the peritoneal cavity) is the first and essential step. However, closed pneumoperitoneum is usually obtained by blind insertion of a Veress needle into the peritoneal cavity, which carries potential risks such as bowel and vascular injury. Until now, there has been no definite measure to visually confirm the position of the needle tip inside the peritoneal cavity. Therefore, this study established an image-guided Veress needle method by combining a fiber probe with optical coherence tomography (OCT). An algorithm was also proposed for determining the exact location of the needle tip from the acquired OCT images. Our method not only generates a series of "live" two-dimensional (2D) images as the needle punctures toward the peritoneal cavity but also eliminates operator variation in image judgment, thus improving the safety of peritoneal access. This study was approved by the Ethics Committee of Taipei Veterans General Hospital (Taipei VGH IACUC 2020-144). A total of 2400 in vivo OCT images, independent of each other, were acquired in experiments comprising forty peritoneal punctures on two piglets. Characteristic OCT image patterns could be observed during the puncturing process. The ROC curve demonstrates the discrimination capability of the classifier based on these quantitative image features: the accuracy of the classifier for determining inside vs. outside of the peritoneum was 98% (AUC=0.98). In summary, the present study demonstrates the ability of the proposed automatic identification method, combined with OCT imaging, to identify the location of the needle tip automatically and objectively. OCT imaging translates the blind closed technique of peritoneal access into a visualized procedure, thus improving its safety. Keywords: pneumoperitoneum, optical coherence tomography, automatic identification, veress needle
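The AUC=0.98 quoted above summarizes the ROC curve: it equals the probability that a randomly chosen "inside" image is scored above a randomly chosen "outside" image. A minimal rank-based computation follows; the classifier scores are invented for illustration.

```python
# Sketch of the AUC metric behind the reported 0.98 discrimination:
# the fraction of (positive, negative) pairs ranked correctly, counting
# ties as half. Scores below are made-up illustration values.

def auc(pos_scores, neg_scores):
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

inside = [0.9, 0.8, 0.75, 0.6]    # classifier scores on "inside" images
outside = [0.4, 0.3, 0.6, 0.1]    # classifier scores on "outside" images
a = auc(inside, outside)
```
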
Procedia PDF Downloads 134
20281 Optimizing Hydrogen Production from Biomass Pyro-Gasification in a Multi-Staged Fluidized Bed Reactor
Authors: Chetna Mohabeer, Luis Reyes, Lokmane Abdelouahed, Bechara Taouk
Abstract:
In the transition to sustainability and the increasing use of renewable energy, hydrogen will play a key role as an energy carrier, and biomass has the potential to accelerate its realization as a major fuel of the future. Pyro-gasification converts organic matter mainly into synthesis gas, or "syngas", consisting chiefly of CO, H2, CH4, and CO2. A second, condensable fraction of the biomass pyro-gasification products are the "tars". Under certain conditions, tars may decompose into hydrogen and other light hydrocarbons. These conditions include two types of cracking: homogeneous cracking, where tars decompose under the effect of temperature (> 1000 °C), and heterogeneous cracking, where catalysts such as olivine, dolomite or biochar are used. The latter process favors tar cracking at temperatures close to pyro-gasification temperatures (~ 850 °C). Pyro-gasification of biomass coupled with the water-gas shift is today the most widely practiced route from biomass to hydrogen. In this work, an innovative solution is proposed for this conversion route, in which all the pyro-gasification products, not only methane, undergo processes that aim to optimize hydrogen production. First, a heterogeneous cracking step was included in the reaction scheme, using biochar (the solid remaining from the pyro-gasification reaction) as catalyst and CO2 and H2O as gasifying agents. This was followed by a catalytic steam methane reforming (SMR) step, for which a Ni-based catalyst was tested under different reaction conditions to optimize the H2 yield. Finally, a water-gas shift (WGS) step with an Fe-based catalyst was added to optimize the H2 yield from CO. The reactor used for cracking was a fluidized bed reactor, and that used for SMR and WGS was a fixed bed reactor. The gaseous products were analyzed continuously using a µ-GC (Fusion PN 074-594-P1F).
With biochar as bed material, more H2 was obtained with steam as the gasifying agent (32 mol% vs. 15 mol% with CO2 at 900 °C); CO and CH4 productions were also higher with steam than with CO2. Steam as gasifying agent and biochar as bed material were hence deemed efficient parameters for the first step. Among all parameters tested, CH4 conversions approaching 100% were obtained from SMR with Ni/γ-Al2O3 as catalyst at 800 °C and a steam/methane ratio of 5, giving rise to about 45 mol% H2. Experiments on the WGS reaction are currently being conducted. At the end of this phase, the four reactions will be performed consecutively and the results analyzed. The final aim is the development of a global kinetic model of the whole system in a multi-staged fluidized bed reactor that can be transferred to ASPEN PlusTM. Keywords: multi-staged fluidized bed reactor, pyro-gasification, steam methane reforming, water-gas shift
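The SMR and WGS steps above have a simple stoichiometric hydrogen budget: SMR (CH4 + H2O → CO + 3 H2) gives three moles of H2 per converted mole of CH4, and WGS (CO + H2O → CO2 + H2) adds one more per shifted mole of CO. The sketch below shows that mole balance; it is an illustration, not the authors' kinetic model, and the conversion values are assumptions.

```python
# Stoichiometric sketch of the reformer + shift hydrogen yield:
#   SMR:  CH4 + H2O -> CO + 3 H2
#   WGS:  CO + H2O  -> CO2 + H2
# Moles of H2 per mole of CH4 fed, for CH4 conversion x_smr in the
# reformer and CO conversion x_wgs in the shift reactor.

def h2_yield(x_smr, x_wgs):
    h2_from_smr = 3.0 * x_smr          # 3 H2 per converted CH4
    co_produced = 1.0 * x_smr          # 1 CO per converted CH4
    h2_from_wgs = co_produced * x_wgs  # 1 extra H2 per shifted CO
    return h2_from_smr + h2_from_wgs

full = h2_yield(1.0, 1.0)   # ideal limit: 4 mol H2 per mol CH4
```

The near-complete CH4 conversion reported at 800 °C with a steam/methane ratio of 5 therefore approaches the upper bound of four moles of H2 per mole of methane once the shift step is added.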
Procedia PDF Downloads 138
20280 Dynamic Response Analysis of Structure with Random Parameters
Authors: Ahmed Guerine, Ali El Hafidi, Bruno Martin, Philippe Leclaire
Abstract:
In this paper, we propose a method for computing the dynamic response of multi-storey structures with uncertain-but-bounded parameters. The effectiveness of the proposed method is demonstrated on a numerical example of a three-storey structure, whose equation of motion is integrated numerically using Newmark's method. The numerical results obtained by the proposed interval analysis method are compared with those of a probabilistic approach: the mean curve provided by the interval analysis method lies between the upper and lower bounds obtained from the probabilistic approach. Keywords: multi-storey structure, dynamic response, interval analysis method, random parameters
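The time integration mentioned above uses Newmark's method. Below is a minimal single-degree-of-freedom sketch of the average-acceleration Newmark scheme (β=1/4, γ=1/2); the three-storey model in the paper is a multi-DOF version of the same algorithm, and the parameter values here are invented for illustration.

```python
import math

# Sketch of Newmark's method (average acceleration, beta=1/4, gamma=1/2)
# for a single-DOF oscillator m*u'' + c*u' + k*u = f(t). The structure's
# actual matrices would replace the scalars m, c, k.

def newmark(m, c, k, u0, v0, f, dt, steps, beta=0.25, gamma=0.5):
    u, v = u0, v0
    a = (f(0.0) - c * v - k * u) / m
    keff = m / (beta * dt**2) + gamma * c / (beta * dt) + k
    for n in range(steps):
        t = (n + 1) * dt
        rhs = (f(t)
               + m * (u / (beta * dt**2) + v / (beta * dt)
                      + (0.5 / beta - 1) * a)
               + c * (gamma * u / (beta * dt) + (gamma / beta - 1) * v
                      + dt * (0.5 * gamma / beta - 1) * a))
        u_new = rhs / keff
        a_new = ((u_new - u) / (beta * dt**2) - v / (beta * dt)
                 - (0.5 / beta - 1) * a)
        v = v + dt * ((1 - gamma) * a + gamma * a_new)
        u, a = u_new, a_new
    return u

# Undamped free vibration with a 1 s period: after one period u returns
# to its initial value (average acceleration adds no numerical damping).
u_end = newmark(m=1.0, c=0.0, k=(2 * math.pi)**2, u0=1.0, v0=0.0,
                f=lambda t: 0.0, dt=0.001, steps=1000)
```
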
Procedia PDF Downloads 190
20279 A New Approach to Image Stitching of Radiographic Images
Authors: Somaya Adwan, Rasha Majed, Lamya'a Majed, Hamzah Arof
Abstract:
In order to produce images of whole body parts, X-rays of different portions of the body are assembled using image stitching methods. This paper presents a new image stitching method that exploits both a feature-based method and a direct (intensity-based) method to identify and merge pairs of X-ray medical images, and investigates the performance of this hybrid approach. The ability of the proposed method to stitch and merge overlapping pairs of images is demonstrated. On standard databases, the proposed method displays comparable, if not superior, performance to other feature-based methods reported in the literature. These results are promising and demonstrate the potential of the proposed method for further development to tackle more advanced stitching problems. Keywords: image stitching, direct based method, panoramic image, X-ray
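The "direct" half of such a hybrid aligns overlapping images by searching for the shift that best matches pixel intensities. A 1D toy version of that search is sketched below; real stitching works on 2D images with sub-pixel refinement, and the intensity profiles here are synthetic.

```python
# Toy sketch of direct (intensity-based) alignment: find the horizontal
# overlap between two 1D intensity profiles by minimizing the mean squared
# difference over candidate shifts. Profiles are synthetic illustrations.

def best_shift(left, right, min_overlap=3):
    """Shift of `right` relative to `left` that best matches the overlap."""
    best, best_err = None, float("inf")
    for s in range(1, len(left) - min_overlap + 1):
        n = min(len(left) - s, len(right))       # overlap length at shift s
        err = sum((left[s + i] - right[i]) ** 2 for i in range(n)) / n
        if err < best_err:
            best, best_err = s, err
    return best

scene = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7]
left, right = scene[:10], scene[6:]              # true shift is 6
s = best_shift(left, right)
```

The feature-based half would instead match distinctive keypoints between the two images and estimate the transform from those correspondences; the hybrid uses each where it is reliable.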
Procedia PDF Downloads 541
20278 Need for Cognition: An Important, Neglected Personality Variable in the Development of Spirituality Within the Context of Twelve Step Recovery from Addictive Disorders
Authors: Paul E. Priester
Abstract:
The Twelve Step approach to recovery from substance use and addictive disorders is an evidence-based model that helps many people recover from a chronic, progressive, fatal disease. Two key processes that contribute to successful recovery from substance use disorders (SUDs) are meeting engagement and the development of spiritual beliefs. Beyond establishing that the development of spiritual beliefs is positively related to recovery from SUDs, there has been a paucity of research exploring individual differences in this development. One personality variable that deserves exploration is the need for cognition, which describes an individual's cognitive style. Individuals with a high need for cognition enjoy examining the complexities of a situation before coming to a conclusion, while individuals with a low need for cognition do not value, or spend time on, cognitively dissecting a situation or decision. It is important to point out that a high need for cognition does not necessarily imply a high level of cognitive ability; indeed, one could argue that a low need for cognition individual is not "wasting" cognitive energy by perseverating over the multitude of aspects of a particular decision. This paper presents two case studies demonstrating the development of spiritual beliefs that enabled long-term recovery from SUDs. The first case study presents an agnostic individual with a low need for cognition cognitive style and his development of spirituality in support of his recovery from alcoholism within the context of Alcoholics Anonymous. The second case study concerns an adamant atheist with a high need for cognition cognitive style.
This second individual, an intravenous cocaine addict and alcoholic, recovers through the development of spirituality within the contexts of Alcoholics Anonymous and Narcotics Anonymous. The two case studies are contrasted with each other, noting how each individual's cognitive style mediated the development of the spirituality that supported his long-term recovery from alcoholism and addiction. Keywords: spirituality, twelve step recovery, need for cognition, individual differences in recovery from addictions
Procedia PDF Downloads 93
20277 Sensing of Cancer DNA Using Resonance Frequency
Authors: Sungsoo Na, Chanho Park
Abstract:
Lung cancer is one of the most common severe diseases leading to human death. Lung cancers are divided into small-cell lung cancer (SCLC) and non-SCLC (NSCLC), and about 80% of cases belong to NSCLC. Several studies have investigated the correlation between the epidermal growth factor receptor (EGFR) and NSCLC, and EGFR inhibitor drugs such as gefitinib and erlotinib have therefore been used as lung cancer treatments. However, these treatments showed low response rates (10~20%) in clinical trials due to EGFR mutations that cause drug resistance. Patients with resistance to EGFR inhibitor drugs are usually positive for KRAS mutations. Assessment of EGFR and KRAS mutations is therefore essential for targeted therapies of NSCLC patients, and to overcome the limitation of conventional therapies, overall EGFR and KRAS mutations have to be monitored. In this work, only the detection of EGFR will be presented. A variety of techniques has been proposed for the detection of EGFR mutations. The standard method for detecting EGFR mutations in ctDNA relies on real-time polymerase chain reaction (PCR), which provides highly sensitive detection; however, the amplification step increases both cost and complexity. Other technologies, such as BEAMing, next generation sequencing (NGS), electrochemical sensors, and silicon nanowire field-effect transistors, have also been presented, but they have limitations of low sensitivity, high cost, or complex data analysis. In this report, we propose a label-free and highly sensitive detection method for lung cancer using a quartz crystal microbalance based platform. The proposed platform is able to sense lung cancer mutant DNA with a limit of detection of 1 nM. Keywords: cancer DNA, resonance frequency, quartz crystal microbalance, lung cancer
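Quartz crystal microbalance sensing rests on the Sauerbrey relation: mass captured on the crystal surface (here, hybridized target DNA) lowers the resonance frequency in proportion to the areal mass. The sketch below evaluates that relation; the 5 MHz fundamental and the 1 µg/cm² loading are illustrative assumptions, not the platform's actual specifications.

```python
import math

# Sketch of the Sauerbrey relation behind QCM sensing:
#   df = -2 * f0^2 * dm / sqrt(rho_q * mu_q)
# Added mass dm (per unit area) lowers the resonance frequency f0.
# Material constants are for AT-cut quartz (CGS units).

RHO_Q = 2.648          # quartz density, g/cm^3
MU_Q = 2.947e11        # quartz shear modulus, g/(cm s^2)

def sauerbrey_shift(f0_hz, dm_g_per_cm2):
    """Frequency shift (Hz) for areal mass loading dm (g/cm^2)."""
    return -2.0 * f0_hz**2 * dm_g_per_cm2 / math.sqrt(RHO_Q * MU_Q)

df = sauerbrey_shift(5e6, 1e-6)   # 1 ug/cm^2 on a 5 MHz crystal
```

For a 5 MHz crystal this works out to roughly -57 Hz per µg/cm², which is why even nanomolar DNA capture produces a measurable frequency shift.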
Procedia PDF Downloads 233
20276 The Brain's Attenuation Coefficient as a Potential Estimator of Temperature Elevation during Intracranial High Intensity Focused Ultrasound Procedures
Authors: Daniel Dahis, Haim Azhari
Abstract:
Noninvasive image-guided intracranial treatments using high intensity focused ultrasound (HIFU) are on course for translation into clinical applications. They include, among others, tumor ablation, hyperthermia, and blood-brain-barrier (BBB) penetration. Since many of these procedures are associated with local temperature elevation, thermal monitoring is essential. MRI constitutes an imaging method with high spatial resolution and thermal mapping capacity, and it is currently the leading modality for temperature guidance, commonly under the name MRgHIFU (magnetic-resonance guided HIFU). Nevertheless, MRI is a very expensive, non-portable modality, which limits its accessibility. Ultrasonic thermal monitoring, on the other hand, could provide a modular, cost-effective alternative with higher temporal resolution and accessibility. In order to assess the feasibility of ultrasonic thermal monitoring of the brain, this study investigated the use of temporal changes in the brain tissue attenuation coefficient (AC) as potential estimators of thermal changes. Newton's law of cooling describes a temporal exponential decay for the temperature of a heated object immersed in relatively cold surroundings. Similarly, in cerebral HIFU treatments, the temperature in the region of interest, i.e., the focal zone, is expected to follow the same law. Thus, it was hypothesized that the AC of the irradiated tissue may follow a temporal exponential behavior during the cool-down regime. Three ex-vivo bovine brain tissue specimens were inserted into plastic containers along with four thermocouple probes per sample. The containers were placed inside a specially built ultrasonic tomograph and scanned at room temperature. The corresponding pixel-averaged AC was acquired for each specimen and used as a reference. Subsequently, the containers were placed in a beaker containing hot water and gradually heated to about 45 °C.
They were then repeatedly rescanned during cool-down, using an ultrasonic through-transmission raster trajectory, until reaching about 30 °C. From the obtained images, the normalized AC and its temporal derivative were registered as functions of temperature and time. The results demonstrated a high correlation (R² > 0.92) of both the brain AC and its temporal derivative with temperature. This supports the hypothesis and indicates the possibility of estimating brain tissue temperature from the temporal thermal changes of the AC. It is important to note that each brain yielded different AC values and slopes, which implies that a calibration step is required for each specimen. Thus, for practical acoustic monitoring of the brain, two steps are suggested: the first consists of simply measuring the AC at normal body temperature; the second entails measuring the AC after a small temperature elevation. In the face of the urgent need for a more accessible thermal monitoring technique for brain treatments, the proposed methodology enables cost-effective, high-temporal-resolution acoustic temperature estimation during HIFU treatments. Keywords: attenuation coefficient, brain, HIFU, image-guidance, temperature
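The exponential cool-down model invoked above (Newton's law of cooling) can be fitted with a simple log-linear regression to recover the decay constant. The sketch below does this on synthetic temperatures, not the ex-vivo measurements; the ambient temperature, initial temperature, and time constant are assumed values.

```python
import math

# Sketch of Newton's-law-of-cooling fitting:
#   T(t) = T_env + (T0 - T_env) * exp(-t / tau)
# Taking log(T - T_env) linearizes the decay, so tau comes from the slope
# of an ordinary least-squares line. Data below are synthetic.

def fit_tau(times, temps, t_env):
    ys = [math.log(T - t_env) for T in temps]        # linearize the decay
    n = len(times)
    mt, my = sum(times) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(times, ys))
             / sum((t - mt) ** 2 for t in times))
    return -1.0 / slope

T_ENV, T0, TAU = 22.0, 45.0, 300.0                    # assumed values
ts = [0.0, 60.0, 120.0, 180.0, 240.0]                 # seconds
Ts = [T_ENV + (T0 - T_ENV) * math.exp(-t / TAU) for t in ts]
tau_hat = fit_tau(ts, Ts, T_ENV)
```

If the AC tracks temperature, the same fit applied to the AC time series would yield the specimen-specific slope that the two-step calibration above is designed to capture.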
Procedia PDF Downloads 161
20275 Numerical Solutions of Generalized Burger-Fisher Equation by Modified Variational Iteration Method
Authors: M. O. Olayiwola
Abstract:
Numerical solutions of the generalized Burger-Fisher equation are obtained using a modified variational iteration method (MVIM) with minimal computational effort. The computed results have been compared with results obtained by other techniques, and the present method is seen to be a very reliable alternative to some existing techniques for such nonlinear problems. Keywords: Burger-Fisher, modified variational iteration method, Lagrange multiplier, Taylor's series, partial differential equation
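For reference, the standard form of the generalized Burger-Fisher equation and the VIM correction functional (with Lagrange multiplier λ = -1) are given below, as commonly stated in the variational iteration literature; the specific modification introduced by the MVIM is not detailed in the abstract.

```latex
% Generalized Burger-Fisher equation (standard form):
\[
u_t + \alpha\, u^{\delta} u_x - u_{xx} = \beta\, u\,\bigl(1 - u^{\delta}\bigr)
\]
% VIM correction functional with Lagrange multiplier \lambda = -1:
\[
u_{n+1}(x,t) = u_n(x,t) - \int_0^t \Bigl[
  \frac{\partial u_n}{\partial s}
  + \alpha\, u_n^{\delta}\frac{\partial u_n}{\partial x}
  - \frac{\partial^2 u_n}{\partial x^2}
  - \beta\, u_n\bigl(1 - u_n^{\delta}\bigr) \Bigr]\, ds
\]
```

Each iteration substitutes the previous approximation into the correction functional, typically starting from the initial condition u₀(x) as the zeroth approximation.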
Procedia PDF Downloads 430
20274 Mapping Iron Content in the Brain with Magnetic Resonance Imaging and Machine Learning
Authors: Gabrielle Robertson, Matthew Downs, Joseph Dagher
Abstract:
Iron deposition in the brain has been linked with a host of neurological disorders, such as Alzheimer's disease, Parkinson's disease, and multiple sclerosis. While some treatment options exist, there are no objective measurement tools that allow monitoring of iron levels in the brain in vivo. An emerging magnetic resonance imaging (MRI) method has recently been proposed to deduce iron concentration through quantitative measurement of magnetic susceptibility. This is a multi-step process that involves repeated modeling of physical processes via approximate numerical solutions. For example, the last two steps of this quantitative susceptibility mapping (QSM) method involve (I) mapping the magnetic field into magnetic susceptibility and (II) mapping magnetic susceptibility into iron concentration. Process I involves solving an ill-posed inverse problem by using regularization via the injection of prior belief. The end result of Process II depends strongly on the model used to describe the molecular content of each voxel (type of iron, water fraction, etc.). Due to these factors, the accuracy and repeatability of QSM have been an active area of research in the MRI and medical imaging community. This work aims to estimate iron concentration in the brain in a single step. A synthetic numerical model of the human head was created by automatically and manually segmenting the human head on a high-resolution grid (640x640x640, 0.4 mm voxels), yielding detailed structures such as microvasculature and subcortical regions as well as bone, soft tissue, cerebrospinal fluid, sinuses, arteries, and eyes. Each segmented region was then assigned tissue properties such as relaxation rates, proton density, electromagnetic tissue properties, and iron concentration. These tissue property values were randomly selected from probability distribution functions derived from a thorough literature review.
In addition to having unique tissue property values, different synthetic head realizations also possess unique structural geometry, created by morphing the boundary regions of different areas within normal physical constraints. This model of the human brain is then used to create synthetic MRI measurements. This is repeated thousands of times, for different head shapes, volumes, tissue properties, and noise realizations. Collectively, this constitutes a training set that is similar to in vivo data but larger than the datasets available from clinical measurements. A 3D convolutional U-Net neural network architecture was used to train data-driven deep learning models to solve for iron concentrations from raw MRI measurements. The performance was then tested both on synthetic data not used in training and on real in vivo data. Results showed that the model trained on synthetic MRI measurements is able to learn iron concentrations in areas of interest directly and more effectively than existing QSM reconstruction methods. For comparison, models trained on random geometric shapes (as proposed in the DeepQSM method) are less effective than models trained on realistic synthetic head models. Such an accurate method for the quantitative measurement of iron deposits in the brain would be of important value in clinical studies aiming to understand the role of iron in neurological disease. Keywords: magnetic resonance imaging, MRI, iron deposition, machine learning, quantitative susceptibility mapping
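The field-to-susceptibility step that makes QSM ill-posed rests on the dipole convolution model: the measured field perturbation is the susceptibility map convolved with the unit dipole kernel, which is diagonal in Fourier space. A minimal numpy sketch of this forward model (the direction synthetic measurements are generated) follows; the grid size, kernel convention, and spherical test object are illustrative choices, not the paper's pipeline.

```python
import numpy as np

# Sketch of the QSM dipole forward model: field = IFFT( D(k) * FFT(chi) )
# with D(k) = 1/3 - kz^2 / |k|^2 (the k=0 term set to 0 by convention).
# Inverting this relation is the ill-posed Process I described above,
# because D vanishes on a conical surface in k-space.

def dipole_field(chi):
    n = chi.shape[0]
    k = np.fft.fftfreq(n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(invalid="ignore", divide="ignore"):
        D = 1.0 / 3.0 - kz**2 / k2
    D[0, 0, 0] = 0.0                      # convention for the k=0 term
    return np.real(np.fft.ifftn(D * np.fft.fftn(chi)))

n = 32
x = np.arange(n) - n // 2
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
chi = (X**2 + Y**2 + Z**2 <= 6**2).astype(float)   # small test sphere
field = dipole_field(chi)
```
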
Procedia PDF Downloads 137
20273 Study Properties of Bamboo Composite after Treatment Surface by Chemical Method
Authors: Kiatnarong Supapanmanee, Ekkarin Phongphinittana, Pongsak Nimdum
Abstract:
Natural fibers are readily available raw materials that are widely used in composite materials. The most common problem facing researchers working with composites made from these fibers is the adhesion between the natural fiber surface and the matrix material, due in part to the hydrophilic nature of natural fibers and the hydrophobic nature of the matrix. Based on these problems, this research selected bamboo fiber, a strong natural fiber, for study. The first step was to study the mechanical properties of the pure bamboo strip by testing the tensile strength at different gauge lengths. The bamboo strip surface was then modified with sodium hydroxide (NaOH) at a concentration of 6 wt% for different soaking periods. After surface modification, the physical and mechanical properties of the pure bamboo strips were studied. The modified and unmodified bamboo strips were molded into composite materials using epoxy as the matrix, to compare the mechanical properties and the adhesion between the fiber surface and the matrix by tensile and bending tests; the results of these tests were also compared with the finite element method (FEM). The results showed that the length of the bamboo strip affects the strength of the fibers, with shorter fibers reaching higher tensile stress. Surface modification with NaOH eliminates lignin and hemicellulose, resulting in smaller dimensions and increased density of the bamboo strip. Accordingly, the treated bamboo strips and composites had better ultimate tensile stress and Young's modulus, and the treatment resulted in better adhesion between the bamboo fiber and the matrix material. Keywords: bamboo fiber, bamboo strip, composite material, bamboo composite, pure bamboo, surface modification, mechanical properties of bamboo, bamboo finite element method
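A first-order estimate often used alongside FEM for fiber composites like the bamboo-epoxy one above is the rule of mixtures for the longitudinal modulus. The sketch below evaluates it; the moduli and fiber volume fraction are assumed values, not the measured properties of the bamboo strips or epoxy used in the study.

```python
# Sketch of the rule of mixtures for the longitudinal Young's modulus of
# a unidirectional fiber composite: E_c = v_f * E_f + (1 - v_f) * E_m.
# All values below are assumptions for illustration.

def rule_of_mixtures(E_fiber, E_matrix, v_fiber):
    """Longitudinal Young's modulus of a unidirectional composite."""
    return v_fiber * E_fiber + (1.0 - v_fiber) * E_matrix

E_c = rule_of_mixtures(E_fiber=20.0, E_matrix=3.0, v_fiber=0.4)  # GPa
```

Better fiber-matrix adhesion (the point of the NaOH treatment) moves the real composite closer to this ideal load-sharing limit, which is one way the measured modulus gains can be interpreted.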
Procedia PDF Downloads 92
20272 Post-Earthquake Road Damage Detection by SVM Classification from QuickBird Satellite Images
Authors: Moein Izadi, Ali Mohammadzadeh
Abstract:
Detection of damaged road sections after an earthquake is essential for coordinating rescuers. In this study, an approach is presented for the semi-automatic detection of damaged roads in a city using pre-event vector maps and both pre- and post-earthquake QuickBird satellite images. Damage is defined in this study as the debris of damaged buildings adjacent to the roads. Spectral and texture features are considered in the SVM classification step to detect damage. Finally, the proposed method is tested on QuickBird pan-sharpened images from the Bam earthquake; the results show that an overall accuracy of 81% and a kappa coefficient of 0.71 are achieved for damage detection, indicating the efficiency and accuracy of the proposed approach.
Keywords: SVM classifier, disaster management, road damage detection, QuickBird images
Procedia PDF Downloads 623
20271 Spectral Domain Fast Multipole Method for Solving Integral Equations of One and Two Dimensional Wave Scattering
Authors: Mohammad Ahmad, Dayalan Kasilingam
Abstract:
In this paper, a spectral domain implementation of the fast multipole method is presented. It is shown that the aggregation, translation, and disaggregation stages of the fast multipole method (FMM) can be performed using spectral domain (SD) analysis. The spectral domain fast multipole method (SD-FMM) has the advantage of eliminating the near field/far field classification used in the conventional FMM formulation. The study focuses on the application of the SD-FMM to the one-dimensional (1D) and two-dimensional (2D) electric field integral equation (EFIE). The cases of a perfectly conducting strip and of circular and square cylinders are numerically analyzed and compared with results from the standard method of moments (MoM).
Keywords: electric field integral equation, fast multipole method, method of moments, wave scattering, spectral domain
Procedia PDF Downloads 406
20270 Analytical Method Development and Validation of Stability-Indicating RP-HPLC Method for Determination of Atorvastatin and Methylcobalamin
Authors: Alkaben Patel
Abstract:
An easy, rapid, economical, precise and accurate stability-indicating RP-HPLC method for the simultaneous estimation of Atorvastatin and Methylcobalamin in their combined dosage form has been developed. The separation was achieved on an LC-20 AT C18 (250 mm × 4.6 mm × 2.6 mm) column with water (pH 3.5):methanol 70:30 as the mobile phase, at a flow rate of 1 ml/min. The detection wavelength for this dosage form is 215 nm. The drug was subjected to the stress conditions of hydrolysis, oxidation, photolysis and thermal degradation.
Keywords: RP-HPLC, atorvastatin, methylcobalamin, method development, validation
Procedia PDF Downloads 336
20269 Power Series Solution to Sliding Velocity in Three-Dimensional Multibody Systems with Impact and Friction
Authors: Hesham A. Elkaranshawy, Amr M. Abdelrazek, Hosam M. Ezzat
Abstract:
The system of ordinary nonlinear differential equations describing the sliding velocity during impact with friction for a three-dimensional rigid multibody system is developed. No analytical solution has previously been obtained for this highly nonlinear system; hence, a power series solution is proposed. Since the validity of this solution is limited to its convergence zone, a suitable time step is chosen, and at its end a new series solution is constructed. For a case study, the trajectory of the sliding velocity built with the proposed method using 6 time steps coincides with a Runge-Kutta solution using 38 time steps.
Keywords: impact with friction, nonlinear ordinary differential equations, power series solutions, rough collision
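The stepwise restart of a convergence-limited series solution can be sketched on a simple nonlinear ODE. The example below is not the paper's sliding-velocity system; y' = y² is a hypothetical stand-in with a known exact solution, and the count of 6 steps mirrors the case study only for illustration.

```python
# Stepwise power-series (Taylor) integration: build a local series, advance
# one step, then restart the series from the new state. Illustrated on the
# nonlinear ODE y' = y^2, whose exact solution is y(t) = y0 / (1 - y0*t).

def taylor_step(y0, h, order=10):
    """Advance y' = y^2 by one step h using a local power series.

    Coefficients follow from the Cauchy product of the series with itself:
        c_0 = y0,  (k+1) * c_{k+1} = sum_{i=0}^{k} c_i * c_{k-i}.
    """
    c = [y0]
    for k in range(order):
        conv = sum(c[i] * c[k - i] for i in range(k + 1))
        c.append(conv / (k + 1))
    # Evaluate the truncated series at t = h (Horner's rule).
    y = 0.0
    for ck in reversed(c):
        y = y * h + ck
    return y

def integrate(y0, t_end, n_steps, order=10):
    """Chain several local series solutions, one per time step."""
    y, h = y0, t_end / n_steps
    for _ in range(n_steps):
        y = taylor_step(y, h, order)
    return y

y = integrate(0.5, 1.0, 6)       # 6 restarted series, as in the case study
exact = 0.5 / (1 - 0.5 * 1.0)    # exact value at t = 1 is 1.0
```

Restarting the series at the end of each step keeps every evaluation inside its convergence zone, which is exactly the device the abstract describes.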
Procedia PDF Downloads 488
20268 Police and Crime Scene Management Model
Authors: Najaf Hamadzadeh Arbabi
Abstract:
Crime scene management is the first and most critical step in criminal investigations, and all criminal investigations depend on the ability of crime scene investigation officers to recognize the importance and role of physical evidence at the crime scene. Because the physical evidence available at the scene is what proves the crime, identifies the perpetrator and establishes the innocence of those wrongly accused, the preservation and examination of the crime scene and the collection of evidence are of great importance. By identifying the factors affecting the management of the crime scene, this research seeks to present an effective and efficient indigenous model for managing crime scenes in Iran. Method: This is an applied and developmental study. The Wilcoxon signed-rank test and the Friedman test (for ranking) were used to analyze the data, and all hypotheses were tested at the 95% confidence level. The target population is 50 judges and experts in Tehran.
Keywords: crime scene, identification, designation, individualization, reconstruction
Procedia PDF Downloads 276
20267 Application of Lattice Boltzmann Method to Different Boundary Conditions in a Two Dimensional Enclosure
Authors: Jean Yves Trepanier, Sami Ammar, Sagnik Banik
Abstract:
Lattice Boltzmann Method has been advantageous in simulating complex boundary conditions and solving for fluid flow parameters through streaming and collision processes. This paper studies three different test cases in a confined domain using the Lattice Boltzmann model. 1. An SRT (Single Relaxation Time) approach in the Lattice Boltzmann model is used to simulate lid-driven cavity flow for different Reynolds numbers (100, 400 and 1000) with a domain aspect ratio of 1, i.e., a square cavity; a moment-based boundary condition is used for more accurate results. 2. A thermal lattice BGK (Bhatnagar-Gross-Krook) model is developed for Rayleigh-Benard convection for two test cases, horizontal and vertical temperature difference, considered separately for a Boussinesq incompressible fluid. The Rayleigh number is varied for both test cases (10^3 ≤ Ra ≤ 10^6) while keeping the Prandtl number at 0.71; a stability criterion with a precise forcing scheme is used for a greater level of accuracy. 3. The phase change problem governed by the heat-conduction equation is studied using the enthalpy-based Lattice Boltzmann model with a single iteration per time step, thus reducing the computational time. A double distribution function approach with a D2Q9 (density) model and a D2Q5 (temperature) model is used for two test cases: conduction-dominated melting and convection-dominated melting. The solidification process is also simulated using the enthalpy-based method with a single distribution function on the D2Q5 model to provide a better understanding of the heat transport phenomenon. The domain for the test cases has an aspect ratio of 2, with some exceptions for a square cavity. An approximate velocity scale is chosen to ensure that the simulations remain within the incompressible regime. Different parameters such as velocities, temperature and Nusselt number are calculated for a comparative study with the existing literature.
The simulated results demonstrate excellent agreement with the existing benchmark solutions within an error limit of ± 0.05, confirming the viability of this method for complex fluid flow problems.
Keywords: BGK, Nusselt, Prandtl, Rayleigh, SRT
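The SRT (BGK) collision stage referred to in test case 1 can be sketched on a single D2Q9 lattice node. This is a minimal illustration, not the paper's solver: streaming, boundary conditions and the thermal and enthalpy models are omitted, and the relaxation time and perturbation values are assumptions.

```python
# Single-node D2Q9 BGK collision: relax the distributions toward the
# second-order equilibrium. Collision conserves mass and momentum, which
# the usage below checks.

# D2Q9 lattice velocities and weights
E = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]
W = [4/9] + [1/9] * 4 + [1/36] * 4

def equilibrium(rho, ux, uy):
    """Second-order equilibrium distribution f_i^eq."""
    usq = ux * ux + uy * uy
    return [w * rho * (1 + 3 * (ex * ux + ey * uy)
                         + 4.5 * (ex * ux + ey * uy) ** 2
                         - 1.5 * usq)
            for (ex, ey), w in zip(E, W)]

def moments(f):
    """Recover density and velocity from the distributions."""
    rho = sum(f)
    ux = sum(fi * ex for fi, (ex, ey) in zip(f, E)) / rho
    uy = sum(fi * ey for fi, (ex, ey) in zip(f, E)) / rho
    return rho, ux, uy

def collide(f, tau=0.6):
    """SRT/BGK relaxation: f <- f - (f - f_eq) / tau."""
    rho, ux, uy = moments(f)
    feq = equilibrium(rho, ux, uy)
    return [fi - (fi - fe) / tau for fi, fe in zip(f, feq)]

# Perturb an equilibrium state and collide once; the collision should
# leave density and momentum unchanged while relaxing the perturbation.
f0 = equilibrium(1.0, 0.05, 0.0)
f0[1] += 0.01
f1 = collide(f0)
```

In a full solver this collision would be followed by streaming of each f_i along its lattice direction and by the boundary treatment (here, the paper's moment-based conditions).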
Procedia PDF Downloads 128
20266 Investigation and Optimization of DNA Isolation Efficiency Using Ferrite-Based Magnetic Nanoparticles
Authors: Tímea Gerzsenyi, Ágnes M. Ilosvai, László Vanyorek, Emma Szőri-Dorogházi
Abstract:
DNA isolation is a crucial step in many molecular biological applications for diagnostic and research purposes. However, traditional extraction requires toxic reagents and commercially available kits are expensive, which has led to the now widespread method of magnetic nanoparticle (MNP)-based DNA isolation. Different ferrite-containing MNPs were examined and compared for their plasmid DNA isolation efficiency. One of the tested MNPs has never before been used for the extraction of plasmid molecules, marking a distinct application. The pDNA isolation process was optimized for each type of nanoparticle, and the best protocol was selected based on different criteria: DNA quantity, quality and integrity. With the best-performing magnetic nanoparticle, which excelled in all aspects, further tests were performed to recover genomic DNA from bacterial cells, and a protocol was developed.
Keywords: DNA isolation, nanobiotechnology, magnetic nanoparticles, protocol optimization, pDNA, gDNA
Procedia PDF Downloads 12
20265 Acceleration Techniques of DEM Simulation for Dynamics of Particle Damping
Authors: Masato Saeki
Abstract:
Presented herein is a novel algorithm for calculating the damping performance of particle dampers. The particle damper is a passive vibration control technique with many practical applications due to its simple design. It consists of granular material constrained to move between two ends in a cavity of a primary vibrating system. The damping effect results from the exchange of momentum as the granular material impacts the walls of the cavity. This damping has the advantage of being independent of the environment; particle damping can therefore be applied in extreme temperature environments where most conventional dampers would fail. Many experimental studies have shown that the efficiency of particle dampers is high in the case of resonant vibration. To use particle dampers effectively, it is necessary to solve the equations of motion for each particle, considering the granularity. The discrete element method (DEM) has been found effective for revealing the dynamics of particle damping. In this method, individual particles are modeled as rigid bodies and interparticle collisions are represented by mechanical elements such as springs and dashpots. However, the computational cost is significant, since the equation of motion for each particle must be solved at each time step. New algorithms are therefore needed to improve the computational efficiency of the DEM. In this study, new algorithms are proposed for implementing a high-performance DEM. On the assumption that the behavior of the granular particles is the same in each divided area of the damper container, the contact force of the primary system with all particles can be taken to equal the product of the number of divided areas and the contact force of the primary system with the granular material per divided area. This simplification makes it possible to considerably reduce the calculation time.
The validity of this calculation method was investigated, and the calculated results were compared with experimental ones. This paper also presents experimental studies of the performance of particle dampers. It is shown that the particle radius affects the noise level, and that the particle size and material influence the damper performance.
Keywords: particle damping, discrete element method (DEM), granular materials, numerical analysis, equivalent noise level
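The acceleration idea described above, scaling the per-area contact force by the number of divided areas, can be sketched with a linear spring-dashpot contact model of the kind the DEM uses. The stiffness, damping and contact values are illustrative assumptions, not parameters from the paper.

```python
# Linear spring-dashpot wall contact (the DEM collision model) plus the
# paper's acceleration trick: simulate one representative divided area of
# the container and multiply its wall force by the number of areas.

def contact_force(overlap, rel_velocity, k=1.0e4, c=5.0):
    """Normal force of one particle on the cavity wall:
    spring term (k * overlap) plus dashpot term (c * relative velocity)."""
    if overlap <= 0.0:
        return 0.0              # no penetration, no contact force
    return k * overlap + c * rel_velocity

def total_wall_force(per_area_contacts, n_areas):
    """Accelerated estimate of the force on the primary system:
    per-area force times the number of identical divided areas."""
    f_area = sum(contact_force(o, v) for o, v in per_area_contacts)
    return n_areas * f_area

# One representative area: two particles in contact, one free particle
# given as (overlap in m, relative velocity in m/s).
contacts = [(1e-4, 0.02), (5e-5, -0.01), (0.0, 0.3)]
f_total = total_wall_force(contacts, n_areas=8)
```

Only the representative area's equations of motion need to be integrated each time step, which is where the claimed reduction in calculation time comes from.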
Procedia PDF Downloads 453
20264 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model
Authors: Donatella Giuliani
Abstract:
In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. Firstly, the Firefly Algorithm is applied in a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique centered on the flashing characteristics of fireflies; in this context, it is used to determine the number of clusters and the related cluster means in a histogram-based segmentation approach. These means are then used in the initialization step for the parameter estimation of a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as prior probabilities of each component. Applying the Bayes rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears solid and reliable even when applied to complex grayscale images. Validation has been performed using different standard measures, namely the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK) and the Davies-Bouldin (DB) index. The achieved results strongly confirm the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of this methodology is the use of the maxima of the responsibilities for pixel assignment, which implies a consistent reduction of the computational costs.
Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation
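The final Bayes-rule assignment step can be sketched as follows. The two-component mixture parameters below are illustrative assumptions; in the method above they would come from Firefly-initialized Expectation-Maximization.

```python
# Maximum-posterior pixel assignment under a fitted Gaussian mixture over
# gray levels: compute responsibilities p(k | x) via Bayes' rule, then
# label each gray level with the component of highest posterior.
import math

def gaussian_pdf(x, mean, std):
    """Univariate normal density N(x | mean, std)."""
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2 * math.pi))

def posteriors(x, weights, means, stds):
    """Responsibilities: p(k | x) proportional to weight_k * N(x | k)."""
    joint = [w * gaussian_pdf(x, m, s)
             for w, m, s in zip(weights, means, stds)]
    total = sum(joint)
    return [j / total for j in joint]

def assign(pixels, weights, means, stds):
    """Label each gray level with its maximum-posterior component."""
    return [max(range(len(means)),
                key=lambda k: posteriors(x, weights, means, stds)[k])
            for x in pixels]

# Hypothetical two-component mixture: dark cluster near gray level 50,
# bright cluster near 200, with the weights acting as priors.
weights, means, stds = [0.4, 0.6], [50.0, 200.0], [20.0, 30.0]
labels = assign([30, 60, 120, 180, 240], weights, means, stds)
```

Because the posteriors depend only on the gray level, they can be computed once per histogram bin rather than once per pixel, which matches the computational saving noted in the abstract.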
Procedia PDF Downloads 217
20263 Purification, Extraction and Visualization of Lipopolysaccharide of Escherichia coli from Urine Samples of Patients with Urinary Tract Infection
Authors: Fariha Akhter Chowdhury, Mohammad Nurul Islam, Anamika Saha, Sabrina Mahboob, Abu Syed Md. Mosaddek, Md. Omar Faruque, Most. Fahmida Begum, Rajib Bhattacharjee
Abstract:
Urinary tract infection (UTI) is one of the most common infectious diseases in Bangladesh, where Escherichia coli is the prevalent organism and responsible for most infections. Lipopolysaccharide (LPS) is known to act as a major virulence factor of E. coli. The present study aimed to purify, extract and visualize LPS from E. coli clinical isolates obtained from urine samples of patients with UTI. E. coli strains were isolated from the urine samples of 10 patients with UTI, and the antibiotic sensitivity pattern of the isolates was determined. LPS purification was carried out using the hot aqueous-phenol method, followed by separation with sodium dodecyl sulfate polyacrylamide gel electrophoresis; the gel was stained directly using the modified silver staining method and with Coomassie blue. The silver-stained gel demonstrated both smooth and rough type LPS, showing trail-like band patterns respectively with and without the O-antigen region. Coomassie blue staining showed no band, confirming the absence of any contaminating protein. The successful extraction of purified LPS from E. coli isolates of UTI patients' urine samples can be an important step toward understanding UTI disease conditions.
Keywords: Escherichia coli, electrophoresis, polyacrylamide gel, silver staining, sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE)
Procedia PDF Downloads 389
20262 A More Powerful Test Procedure for Multiple Hypothesis Testing
Authors: Shunpu Zhang
Abstract:
We propose a new multiple test, called the minPOP test, for testing multiple hypotheses simultaneously. Under the assumption that the test statistics are independent, we show that the minPOP test has higher global power than existing multiple testing methods. We further propose a stepwise multiple-testing procedure based on the minPOP test and two of its modified versions (the Double Truncated and Left Truncated minPOP tests), and show that these multiple tests have strong control of the family-wise error rate (FWER). A method for finding the p-values of the proposed tests after adjusting for multiplicity is also developed. Simulation results show that the Double Truncated and Left Truncated minPOP tests in general yield a higher number of rejections than existing multiple testing procedures.
Keywords: multiple test, single-step procedure, stepwise procedure, p-value for multiple testing
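The minPOP statistics themselves are the paper's contribution and are not reproduced here. As a generic illustration of what a stepwise procedure with strong FWER control looks like, the sketch below implements Holm's classical step-down adjustment, a standard baseline against which such proposals are compared.

```python
# Holm step-down multiplicity adjustment: sort raw p-values, multiply the
# i-th smallest by (m - i), and enforce monotonicity so adjusted p-values
# never decrease along the ordering. Strong FWER control, no independence
# assumption needed.

def holm_adjust(p_values):
    """Return Holm step-down adjusted p-values, in the input order."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, idx in enumerate(order):
        adj = min(1.0, (m - rank) * p_values[idx])
        running_max = max(running_max, adj)   # keep the sequence monotone
        adjusted[idx] = running_max
    return adjusted

# Hypothetical raw p-values for four simultaneous hypotheses.
raw = [0.01, 0.04, 0.03, 0.20]
adj = holm_adjust(raw)
rejections = [p <= 0.05 for p in adj]
```

A stepwise procedure such as this rejects at least as many hypotheses as the corresponding single-step (Bonferroni) rule, which is the kind of power gain the abstract reports for the truncated minPOP variants.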
Procedia PDF Downloads 83
20261 Anisotropic Total Fractional Order Variation Model in Seismic Data Denoising
Authors: Jianwei Ma, Diriba Gemechu
Abstract:
In seismic data processing, attenuation of random noise is the basic step for improving data quality for further application of seismic data in exploration and development in the gas and oil industries. The signal-to-noise ratio also largely determines the quality of seismic data and affects the reliability and accuracy of the seismic signal during interpretation. To use seismic data for further application and interpretation, the signal-to-noise ratio must be improved by attenuating random noise effectively while preserving the important features and information of the seismic signal. To this end, we introduce an anisotropic total fractional order denoising algorithm. The anisotropic total fractional order variation model, defined in the space of fractional order bounded variation, is proposed as a regularization in seismic denoising. The split Bregman algorithm is employed to solve the minimization problem of the anisotropic total fractional order variation model, and the corresponding denoising algorithm for the proposed method is derived. We test the effectiveness of the proposed method on synthetic and real seismic data sets, and the denoised result is compared with F-X deconvolution and the non-local means denoising algorithm.
Keywords: anisotropic total fractional order variation, fractional order bounded variation, seismic random noise attenuation, split Bregman algorithm
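The split Bregman algorithm owes its efficiency to a closed-form shrinkage step. The sketch below shows that d-subproblem update for an ordinary (integer-order) anisotropic TV term in 1D; the fractional-order derivative of the proposed model is replaced here by a plain forward difference, and the signal and parameter values are illustrative assumptions.

```python
# The d-update inside a split Bregman iteration for anisotropic TV:
# d_i = shrink((Du)_i + b_i, 1/lambda), where D is a forward difference
# and b is the Bregman variable. The shrink operator is the exact
# minimizer of |d| + (lambda/2) * (d - x)^2.

def shrink(x, t):
    """Scalar soft-threshold with threshold t."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def bregman_d_update(u, b, lam):
    """One d-subproblem solve of split Bregman for 1D anisotropic TV."""
    grad = [u[i + 1] - u[i] for i in range(len(u) - 1)]  # forward difference
    return [shrink(g + bi, 1.0 / lam) for g, bi in zip(grad, b)]

u = [0.0, 0.1, 1.2, 1.0]   # noisy piecewise-constant signal (assumed)
b = [0.0, 0.0, 0.0]        # Bregman variable, zero at the first iteration
d = bregman_d_update(u, b, lam=5.0)
```

Small gradients (noise) are thresholded to zero while the large jump survives, which is why Bregman-split TV models remove random noise yet preserve sharp seismic events; in the paper's model the difference operator is fractional-order rather than first-order.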
Procedia PDF Downloads 207
20260 Development of New Technology Evaluation Model by Using Patent Information and Customers' Review Data
Authors: Kisik Song, Kyuwoong Kim, Sungjoo Lee
Abstract:
Many global firms and corporations identify new technologies and opportunities by locating vacant technology through patent analysis. However, previous studies failed to focus on technologies that promised continuous growth in industrial fields, and most studies that derive new technology opportunities do not test practical effectiveness. Since previous studies depended on expert judgment, evaluating new technologies based on patent analysis was costly and time-consuming. This research therefore suggests a quantitative and systematic approach to technology evaluation indicators using patent data and data from customer communities. The first step involves collecting these two types of data, which are then used to construct evaluation indicators and to apply them to the evaluation of new technologies. This type of data mining allows a new method of technology evaluation and better prediction of how new technologies will be adopted.
Keywords: data mining, evaluating new technology, technology opportunity, patent analysis
Procedia PDF Downloads 377
20259 RPM-Synchronous Non-Circular Grinding: An Approach to Enhance Efficiency in Grinding of Non-Circular Workpieces
Authors: Matthias Steffan, Franz Haas
Abstract:
Grinding is one of the last production steps in a value-added manufacturing chain; within this step, workpiece geometry and surface roughness are determined. Up to this process stage, considerable cost and energy have already been spent on the components, so according to the current state of the art, large safety reserves are calculated in order to guarantee process capability. For non-circular grinding in particular, this leads to considerable losses of process efficiency. With present technology, the various non-circular geometries on a workpiece must be ground one after another in an oscillating process in which the X- and Q-axes of the machine are coupled. With the approach of RPM-synchronous non-circular grinding, such workpieces can be machined in an ordinary plunge grinding process: the rotational rates of the workpiece and the grinding wheel are held in a fixed ratio, and a non-circular grinding wheel is used to transfer its geometry onto the workpiece. The authors use a worldwide unique machine tool that was especially designed for this technology; very high rotational rates of the workpiece spindle (up to 4500 rpm) are mandatory for the success of this grinding process. The grinding is performed in a two-step process. For roughing, a highly porous vitrified-bonded grinding wheel with medium grain size is used; it ensures high specific material removal rates for efficiently producing the non-circular geometry on the workpiece. This step is governed by a force control algorithm that uses data acquired from a three-component force sensor located in the dead centre of the tailstock. For finishing, a grinding wheel with a fine grain size is used. Roughing and finishing are performed consecutively in the same clamping of the workpiece with two locally separated grinding spindles. The approach of RPM-synchronous non-circular grinding shows great efficiency enhancement in non-circular grinding.
For the first time, three-dimensional non-circular shapes can be ground, which opens up various fields of application. The automotive industry in particular shows great interest in this emerging finishing-machining trend.
Keywords: efficiency enhancement, finishing machining, non-circular grinding, rpm-synchronous grinding
Procedia PDF Downloads 283