Search results for: ultrasound extraction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2365

1645 A Rare Case Report of Wandering Spleen Torsion

Authors: Steven Robinson, Adriana Dager, Param Patel

Abstract:

Wandering spleen is a rare condition in which there is abnormal development of the ligamentous peritoneal attachments of the spleen that normally anchor it in the left upper quadrant of the abdomen. Ligamentous abnormalities can be congenital, or acquired through pregnancy, injury, or iatrogenic causes. Absence or laxity of these ligaments allows migration of the spleen into ectopic portions of the abdomen, and is also associated with an elongated vascular pedicle. The incidence of wandering spleen is reported at less than 0.25%, with a female to male ratio of approximately 6:1. The most common complication of a wandering spleen is torsion around its vascular pedicle, which can lead to thrombosis and infarction. Torsion of a wandering spleen is a rare but important cause of an acute abdomen. Imaging, specifically CT or ultrasound, is crucial in the diagnosis. We present a case of a torsed wandering spleen that was treated with splenectomy.

Keywords: Wandering Spleen, Torsion, Splenic Torsion, Spleen

Procedia PDF Downloads 75
1644 Comparative Analysis of Costs and Well Drilling Techniques for Water, Geothermal Energy, Oil and Gas Production

Authors: Thales Maluf, Nazem Nascimento

Abstract:

The development of society relies heavily on the total amount of energy obtained and its consumption. Over the years, there has been an advancement in energy attainment, which is directly related to certain natural resources and developing systems. Some of these resources should be highlighted for their remarkable presence in the world's energy grid, such as water, petroleum, and gas, while others deserve attention for representing an alternative to diversify the energy grid, like geothermal sources. Because all these resources are extracted from the underground, drilling wells is a mandatory exploration activity, one that involves a prior geological study and adequate preparation. It also involves a cleaning process and an extraction process that can be executed by different procedures. For that reason, this research aims at enhancing exploration processes through a comparative analysis of the costs and techniques used to drill these wells. The analysis itself is a bibliographical review drawing on books, scientific papers, and schoolwork, and mainly explores drilling methods and technologies, equipment used, well measurements, extraction methods, and production costs. Besides the techniques and costs of the drilling processes, some properties and general characteristics of these sources are also compared. Preliminary studies show major differences in the exploration processes, mostly because these resources are naturally distinct. Water wells, for instance, are hundreds of meters long because water is stored close to the surface, while oil, gas, and geothermal production wells can reach thousands of meters, which makes them more expensive to drill. The drilling methods present some general similarities, especially regarding the main mechanism of perforation, but since water is stored closer to the surface than the other resources, there is a wider variety of methods.
Water wells can be drilled by rotary, percussion, rotary-percussion, and some other simpler methods. Oil and gas production wells, on the other hand, require rotary or rotary-percussion drilling with a proper structure called a drill rig and resistant materials for the drill bits and other components, mostly because these resources are stored in sedimentary basins that can lie thousands of meters underground. Geothermal production wells also require rotary or rotary-percussion drilling, along with both an injection well and an extraction well. Exploration efficiency also depends on the permeability of the soil, which is why Enhanced Geothermal Systems (EGS) have been developed. Throughout this review, it can be verified that analyzing the extraction processes of energy resources is essential, since these resources are responsible for society's development. Furthermore, the comparative analysis of costs and well drilling techniques for water, geothermal energy, oil, and gas production, which is the main goal of this research, can enable the growth of the energy generation field through the emergence of ideas that improve the efficiency of energy generation processes.

Keywords: drilling, water, oil, gas, geothermal energy

Procedia PDF Downloads 136
1643 Factors Affecting Aluminum Dissolve from Acidified Water Purification Sludge

Authors: Wen Po Cheng, Chi Hua Fu, Ping Hung Chen, Ruey Fang Yu

Abstract:

Recovering resources from water purification sludge (WPS) has been gradually stipulated in environmental protection laws and regulations in many nations. Hence, reusing WPS is becoming an important topic, and recovering alum from WPS is one of the many practical alternatives. Most previous research has studied the amphoteric characteristic of aluminum hydroxide to find the optimum pH range for dissolving Al(III) species from WPS, but discussion of the related reaction kinetics or mechanisms has been lacking. Therefore, in this investigation, a WPS solution was broken up by ultrasound to make the particle size of the reactants smaller and their specific surface area larger. According to reaction kinetics, these changes increase the quantity of dissolved aluminum salt and speed up the reaction.

Keywords: aluminum, acidification, sludge, recovery

Procedia PDF Downloads 619
1642 Ultrasonic Densitometry of Alveolar Bone Jaw during Retention Period of Orthodontic Treatment

Authors: Margarita A. Belousova, Sergey N. Ermoliev, Nina K. Loginova

Abstract:

A method of intraoral ultrasound densitometry was developed to diagnose the mineral density of the alveolar jaw bone during the retention period of orthodontic treatment (Patent of the Russian Federation № 2541038). A significant decrease in ultrasonic wave speed and bone mineral density was revealed in patients with relapses of dentition anomalies during the retention period of orthodontic treatment.

Keywords: intraoral ultrasonic densitometry, speed of sound, alveolar jaw bone, relapses of dentition anomalies, retention period of orthodontic treatment

Procedia PDF Downloads 376
1641 Extraction and Antibacterial Studies of Oil from Three Mango Kernel Obtained from Makurdi, Nigeria

Authors: K. Asemave, D. O. Abakpa, T. T. Ligom

Abstract:

The ability of bacteria to develop resistance to many antibiotics cannot be ignored, given the multifaceted health challenges of the present times. For this reason, a lot of attention is on botanicals and their products in the search for new antibacterial agents. At the same time, mango kernel oils (MKO) can be heavily valorized by taking advantage of the myriad bioactive phytochemicals they contain. Herein, we validated the use of MKO as a bioactive agent against bacteria. The MKOs for the study were extracted by Soxhlet means with ethanol and hexane for 4 h from three different mango kernels, namely 'local' (sample A), 'julie' (sample B), and 'john' (sample C). Prior to extraction, the seed kernels were dried in an oven at 100 °C for 8 h and ground into fine particles. Hexane gave a higher yield of the oils than ethanol. It was also qualitatively confirmed that the mango kernel oils contain phytochemicals such as phenols, quinones, saponins, and terpenoids. The antibacterial activities of the MKO against both Gram-positive (Staphylococcus aureus) and Gram-negative (Pseudomonas aeruginosa) bacteria at different concentrations showed that the oils extracted with ethanol gave better antibacterial properties than those extracted with hexane. Moreover, the bioactivities were best with the local mango kernel oil. Indeed, this work has validated the previous claim that MKOs are effective antibacterial agents. Thus, these oils (especially the ethanol-derived ones) can be used as bacteriostatic and antibacterial agents in the food, cosmetics, and allied industries.

Keywords: bacteria, mango, kernel, oil, phytochemicals

Procedia PDF Downloads 145
1640 Extracting Therapeutic Grade Essential Oils from the Lamiaceae Plant Family in the United Arab Emirates (UAE): Highlights on Great Possibilities and Severe Difficulties

Authors: Suzan M. Shahin, Mohammed A. Salem

Abstract:

Essential oils are expensive phytochemicals produced and extracted from specific species belonging to particular families in the plant kingdom. In the United Arab Emirates (UAE), which is located in the arid region of the world, nine species from the Lamiaceae family have the capability to produce therapeutic grade essential oils. These species include Mentha spicata, Ocimum forskolei, Salvia macrosiphon, Salvia aegyptiaca, Salvia macilenta, Salvia spinosa, Teucrium polium, Teucrium stocksianum, and Zataria multiflora. Although such potential species are indigenous to the UAE, there are almost no studies available that investigate the chemical composition and quality of the extracted essential oils under the UAE's climatological conditions. Therefore, great attention has to be given to such valuable natural resources through highly supported research projects, tailored to the UAE's conditions, investigating different extraction techniques, including the application of the latest available technologies, such as supercritical fluid CO2. This is crucially needed in order to accomplish the greatest possibilities in the medicinal field, specifically the discovery of new therapeutic chemotypes, as well as to achieve the sustainability of this natural resource in the country.

Keywords: essential oils, extraction techniques, Lamiaceae, traditional medicine, United Arab Emirates (UAE)

Procedia PDF Downloads 455
1639 Is there Anything Useful in That? High Value Product Extraction from Artemisia annua L. in the Spent Leaf and Waste Streams

Authors: Anike Akinrinlade

Abstract:

The world population is estimated to grow from 7.1 billion to 9.22 billion by 2075, increasing therefore by 23% from the current global population. Much of the demographic change up to 2075 will take place in the less developed regions. There are currently 54 countries defined as having 'low-middle income' economies, which need new ways to generate valuable products from the resources currently available. Artemisia annua L. is widely used for the extraction of the phytochemical artemisinin, which accounts for around 0.01 to 1.4% of the dry weight of the plant. Artemisinin is used in the treatment of malaria, a disease rampant in sub-Saharan Africa and many other countries. Once artemisinin has been extracted, the spent leaf and waste streams are disposed of as waste. A feasibility study was carried out looking at increasing the biomass value of A. annua by designing a biorefinery in which spent leaf and waste streams are utilized for high value product generation. Quercetin, ferulic acid, dihydroartemisinic acid, artemisinic acid, and artemisinin were screened for in the waste stream samples and the spent leaf. The analytical results showed that artemisinin, artemisinic acid, and dihydroartemisinic acid were present in the waste extracts, as well as camphor and arteannuin B. Ongoing efforts are looking at using more industrially relevant solvents to extract the phytochemicals from the waste fractions and at investigating how microwave pyrolysis of spent leaf can be utilized to generate bio-products.

Keywords: high value product generation, bioinformatics, biomedicine, waste streams, spent leaf

Procedia PDF Downloads 339
1638 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients

Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho

Abstract:

Multiple Sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of detail it provides, is the gold standard exam for the diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, far beyond the limits of normal aging. Thus, brain volume quantification becomes an essential task for future analysis of the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to manually extract important information. This manual analysis is prone to errors and is time consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation are extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. Thus, the purpose of this work was to evaluate brain volume in MRI of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and brain volume quantification. The first image processing step was to perform brain extraction by skull stripping of the original image. In the skull stripper for brain MRI, the algorithm registers a grayscale atlas image to the grayscale patient image. The associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level-sets (the edge of the brain-skull border with dedicated expansion, curvature, and advection terms).
In the second step, the brain volume was quantified by counting the voxels belonging to the segmentation mask and converting the total to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We concluded that the automatic method applied in this work can be used for brain extraction and brain volume quantification in MRI. The development and use of computer programs can assist health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
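The volume-quantification step described above (count the voxels inside the segmentation mask, then convert to cc) can be sketched in a few lines. A minimal sketch; the binary mask and the 1 × 1 × 1.2 mm voxel spacing below are illustrative assumptions, not the study's actual data.

```python
def brain_volume_cc(mask, voxel_spacing_mm):
    """Volume of a binary 3D segmentation mask in cc (1 cc = 1000 mm^3)."""
    voxel_volume_mm3 = 1.0
    for s in voxel_spacing_mm:
        voxel_volume_mm3 *= s
    # Count voxels set to 1 in the nested-list (slice, row, voxel) mask.
    n_voxels = sum(sum(sum(row) for row in sl) for sl in mask)
    return n_voxels * voxel_volume_mm3 / 1000.0

# Toy mask: 2 slices of 2x2 voxels, 5 voxels inside the "brain".
mask = [[[1, 1], [1, 0]],
        [[1, 0], [0, 1]]]
print(brain_volume_cc(mask, (1.0, 1.0, 1.2)))  # ~0.006 cc (5 voxels x 1.2 mm^3)
```

In practice the mask would come from the skull-stripping step and the spacing from the DICOM/NIfTI header; the arithmetic is the same.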

Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper

Procedia PDF Downloads 139
1637 Image Processing Techniques for Surveillance in Outdoor Environments

Authors: Jayanth C., Anirudh Sai Yetikuri, Kavitha S. N.

Abstract:

This paper explores the development and application of computer vision and machine learning techniques for real-time pose detection, facial recognition, and number plate extraction. Utilizing MediaPipe for pose estimation, the research presents methods for detecting hand raises and ducking postures through real-time video analysis. Complementarily, facial recognition is employed to compare and verify individual identities using the face recognition library. Additionally, the paper demonstrates a robust approach for extracting and storing vehicle number plates from images, integrating Optical Character Recognition (OCR) with a database management system. The study highlights the effectiveness and versatility of these technologies in practical scenarios, including security and surveillance applications. The findings underscore the potential of combining computer vision techniques to address diverse challenges and enhance automated systems for both individual and vehicular identification. This research contributes to the fields of computer vision and machine learning by providing scalable solutions and demonstrating their applicability in real-world contexts.
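The hand-raise detection described above reduces to a simple geometric rule once pose landmarks are available. A minimal sketch, assuming MediaPipe-style normalized coordinates where y grows downward from the top of the frame; the landmark names and margin are illustrative assumptions, not the authors' exact implementation.

```python
def hand_raised(landmarks, margin=0.05):
    """True if either wrist is above the corresponding shoulder (smaller y)."""
    left = landmarks["left_wrist"][1] < landmarks["left_shoulder"][1] - margin
    right = landmarks["right_wrist"][1] < landmarks["right_shoulder"][1] - margin
    return left or right

pose = {  # (x, y) normalized coordinates, y = 0 at the top of the image
    "left_shoulder": (0.40, 0.50), "right_shoulder": (0.60, 0.50),
    "left_wrist": (0.38, 0.30),    "right_wrist": (0.62, 0.75),
}
print(hand_raised(pose))  # left wrist well above left shoulder -> True
```

A ducking posture could be detected analogously, e.g. by comparing hip and shoulder heights against a calibrated baseline.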

Keywords: computer vision, pose detection, facial recognition, number plate extraction, machine learning, real-time analysis, OCR, database management

Procedia PDF Downloads 4
1636 Recovery of Copper and Gold by Delamination of Printed Circuit Boards Followed by Leaching and Solvent Extraction Process

Authors: Kamalesh Kumar Singh

Abstract:

Due to the increasing volume of electronic waste, especially ICT-related gadgets, green recycling remains a great challenge. This article presents a two-stage, eco-friendly hydrometallurgical route for the recovery of gold from the delaminated metallic layers of waste mobile phone Printed Circuit Boards (PCBs). Initially, mobile phone PCBs are downsized (1x1 cm²) and treated with the organic solvent dimethylacetamide (DMA) to separate the metallic fraction from the non-metallic glass fiber. In the first stage, the liberated metallic sheets are used for the selective dissolution of copper in an aqueous leaching reagent. The influence of various parameters, such as the type of leaching reagent, the concentration of the solution, temperature, time, and pulp density, is optimized for effective leaching (almost 100%) of copper. Results have shown that 3M nitric acid is a suitable reagent for copper leaching at room temperature; given its chemical features, gold remained in the solid residue. In the second stage, the separated residue is used for the recovery of gold using sulphuric acid combined with a halide salt. In this halide leaching, Cl₂ or Br₂ is generated as an in-situ oxidant to improve the leaching of gold. Results have shown that almost 92% of the gold is recovered at the optimized parameters.

Keywords: printed circuit boards, delamination, leaching, solvent extraction, recovery

Procedia PDF Downloads 48
1635 Microfluidic Continuous Approaches to Produce Magnetic Nanoparticles with Homogeneous Size Distribution

Authors: Ane Larrea, Victor Sebastian, Manuel Arruebo, Jesus Santamaria

Abstract:

We present a gas-liquid microfluidic system as a reactor to obtain magnetite nanoparticles with an excellent degree of control over their crystalline phase, shape, and size. Several types of microflow approaches were selected to prevent nanomaterial aggregation and to promote a homogeneous size distribution. The selected reactor consists of a mixing stage aided by ultrasound waves and a reaction stage using a N2-liquid segmented flow to prevent oxidation of magnetite to non-magnetic phases. A milli-fluidic reactor was developed to increase the production rate, and a magnetite throughput close to 450 mg/h was obtained in a continuous fashion.

Keywords: continuous production, magnetic nanoparticles, microfluidics, nanomaterials

Procedia PDF Downloads 583
1634 Feasibility of Chicken Feather Waste as a Renewable Resource for Textile Dyeing Processes

Authors: Belayihun Missaw

Abstract:

Cotton cationization is an emerging area that addresses the environmental problems associated with the reactive dyeing of cotton. In this study, a keratin hydrolysate cationizing agent was extracted from chicken feathers and optimized to eliminate the usage of salt during dyeing. Cotton was cationized using the extracted keratin hydrolysate, and the cationized cotton was dyed without salt. The effects of extraction conditions such as caustic soda concentration, temperature, and time on the protein yield from chicken feathers and on the colour strength (K/S) values were studied, and these process conditions were optimized. The optimum extraction conditions were 25 g/L caustic soda at 50 °C for 105 minutes, giving an average yield of 91.2% and a colour strength value of 4.32. The effects of salt addition, pH, and cationizing agent concentration on yield and colour strength were also studied and optimized. It was observed that a slightly acidic condition with a 4% (owf) concentration of cationizing agent gives better dyeability compared to normal cotton reactive dyeing. The physical properties of the cationized-dyed fabric were assessed, and the results reveal that cationization has a similar effect to normal dyeing of cotton. The cationization of cotton with keratin extract was found to be successful and economically viable.

Keywords: cotton materials, cationization, reactive dye, keratin hydrolysate

Procedia PDF Downloads 54
1633 An Automated Optimal Robotic Assembly Sequence Planning Using Artificial Bee Colony Algorithm

Authors: Balamurali Gunji, B. B. V. L. Deepak, B. B. Biswal, Amrutha Rout, Golak Bihari Mohanta

Abstract:

Robots play an important role in operations like pick and place, assembly, spot welding, and much more in manufacturing industries. Among these, assembly is a particularly important process: about 20% of manufacturing cost is attributed to the assembly process. To perform the assembly task effectively, Assembly Sequence Planning (ASP) is required. ASP is a multi-objective, non-deterministic optimization problem; achieving the optimal assembly sequence involves a huge search space and is highly complex in nature. Many researchers have applied different algorithms to the ASP problem, but these have several limitations, such as convergence to locally optimal solutions, huge search spaces, long execution times, and complexity in applying the algorithm. Keeping these limitations in mind, this paper proposes a new automated optimal robotic assembly sequence planning method using the Artificial Bee Colony (ABC) algorithm. In this approach, assembly predicates are extracted automatically through a Computer Aided Design (CAD) interface instead of manually. This reduces the time needed to extract the assembly predicates and obtain feasible assembly sequences. The fitness evaluation of the obtained feasible sequences is carried out using the ABC algorithm to generate the optimal assembly sequence. The proposed methodology is applied to different industrial products, and the results are compared with past literature.
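The employed/onlooker/scout cycle of the ABC algorithm can be sketched on a toy continuous objective. A minimal sketch only: the real ASP problem is combinatorial, so the sphere function and all parameter values here are illustrative assumptions, not the authors' implementation.

```python
import random

def sphere(x):
    """Toy objective: minimum 0 at the origin."""
    return sum(v * v for v in x)

def abc_minimize(f, dim=2, n_food=10, limit=20, iters=200, seed=1):
    rng = random.Random(seed)
    foods = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_food)]
    trials = [0] * n_food
    best = min(foods, key=f)[:]          # memorize best-so-far solution

    def try_improve(i):
        # Perturb one coordinate toward/away from a random other source.
        k, j = rng.randrange(n_food), rng.randrange(dim)
        cand = foods[i][:]
        cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        if f(cand) < f(foods[i]):
            foods[i], trials[i] = cand, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                  # employed bee phase
            try_improve(i)
        fits = [1.0 / (1.0 + f(s)) for s in foods]
        total = sum(fits)
        for _ in range(n_food):                  # onlooker phase (roulette wheel)
            r, acc = rng.uniform(0, total), 0.0
            for i, ft in enumerate(fits):
                acc += ft
                if acc >= r:
                    break
            try_improve(i)
        for i in range(n_food):                  # scout phase: abandon stale sources
            if trials[i] > limit:
                foods[i] = [rng.uniform(-5, 5) for _ in range(dim)]
                trials[i] = 0
        cycle_best = min(foods, key=f)
        if f(cycle_best) < f(best):
            best = cycle_best[:]
    return best

best = abc_minimize(sphere)
print(sphere(best))  # a small value close to 0
```

For ASP, the continuous food source would be replaced by an assembly sequence and the objective by a fitness built from the CAD-derived predicates.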

Keywords: assembly sequence planning, CAD, artificial Bee colony algorithm, assembly predicates

Procedia PDF Downloads 233
1632 Laser Data Based Automatic Generation of Lane-Level Road Map for Intelligent Vehicles

Authors: Zehai Yu, Hui Zhu, Linglong Lin, Huawei Liang, Biao Yu, Weixin Huang

Abstract:

With the development of intelligent vehicle systems, high-precision road maps are increasingly needed in many aspects. Automatic lane line extraction and modeling are the most essential steps in the generation of a precise lane-level road map. In this paper, an automatic lane-level road map generation system is proposed. To extract the road markings on the ground, a multi-region Otsu thresholding method is applied, which calculates the intensity value of the laser data that maximizes the variance between background and road markings. The extracted road marking points are then projected onto a raster image and clustered using a two-stage clustering algorithm. Lane lines are subsequently recognized from these clusters by the shape features of their minimum bounding rectangles. To ensure the storage efficiency of the map, the lane lines are approximated by cubic polynomial curves using a Bayesian estimation approach. The proposed lane-level road map generation system has been tested under urban and expressway conditions in Hefei, China. The experimental results show that our method achieves excellent extraction and clustering performance, and the fitted lines reach a high positional accuracy with an error of less than 10 cm.
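The single-region Otsu criterion underlying the multi-region variant used here picks the threshold that maximizes the between-class variance of the intensity histogram. A minimal sketch on a toy 8-bin histogram; the bin values are illustrative assumptions, not the paper's laser data.

```python
def otsu_threshold(hist):
    """Return the bin index t that maximizes between-class variance,
    splitting the histogram into bins 0..t and t+1..end."""
    total = sum(hist)
    total_mean = sum(i * h for i, h in enumerate(hist)) / total
    best_t, best_var = 0, -1.0
    w0 = mu0_sum = 0.0
    for t in range(len(hist) - 1):
        w0 += hist[t]                     # weight of the low-intensity class
        mu0_sum += t * hist[t]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0 = mu0_sum / w0                # class means
        mu1 = (total_mean * total - mu0_sum) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy histogram: dark road surface (bins 0-3) vs bright markings (6-7).
hist = [40, 35, 30, 10, 2, 1, 25, 30]
print(otsu_threshold(hist))  # -> 3: bins 0-3 classed as background, 4-7 as markings
```

The multi-region variant in the paper applies this criterion separately per region of the scan, so that varying reflectance does not require a single global threshold.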

Keywords: curve fitting, lane-level road map, line recognition, multi-thresholding, two-stage clustering

Procedia PDF Downloads 126
1631 Single-Crystal Kerfless 2D Array Transducer for Volumetric Medical Imaging: Theoretical Study

Authors: Jurij Tasinkiewicz

Abstract:

The aim of this work is to present a theoretical analysis of a 2D ultrasound transducer comprised of crossed arrays of metal strips placed on both sides of a thin piezoelectric layer. Such a structure is capable of electronic beam-steering of the generated wave beam in both elevation and azimuth. In this paper, a semi-analytical model of the considered transducer is developed, based on a generalization of the well-known BIS-expansion method. Specifically, applying the electrostatic approximation, the electric field components on the surface of the layer are expanded into fast-converging series of doubly periodic spatial harmonics with corresponding amplitudes represented by properly chosen Legendre polynomials. The problem is reduced to the numerical solution of a certain system of linear equations for the unknown expansion coefficients.

Keywords: beamforming, transducer array, BIS-expansion, piezoelectric layer

Procedia PDF Downloads 419
1630 An Intelligent Text Independent Speaker Identification Using VQ-GMM Model Based Multiple Classifier System

Authors: Ben Soltane Cheima, Ittansa Yonas Kelbesa

Abstract:

Speaker Identification (SI) is the task of establishing the identity of an individual based on his or her voice characteristics. The SI task is typically achieved by two-stage signal processing: training and testing. The training process calculates speaker-specific feature parameters from the speech and generates speaker models accordingly. In the testing phase, speech samples from unknown speakers are compared with the models and classified. Even though the performance of speaker identification systems has improved due to recent advances in speech processing techniques, there is still a need for improvement. In this paper, a Closed-Set Text-Independent Speaker Identification System (CISI) based on a Multiple Classifier System (MCS) is proposed, using Mel Frequency Cepstrum Coefficients (MFCC) for feature extraction and a suitable combination of vector quantization (VQ) and Gaussian Mixture Models (GMM), together with the Expectation Maximization (EM) algorithm, for speaker modeling. The use of a Voice Activity Detector (VAD) with a hybrid approach based on Short Time Energy (STE) and statistical modeling of background noise in the pre-processing step of feature extraction yields a better and more robust automatic speaker identification system. An investigation of the Linde-Buzo-Gray (LBG) clustering algorithm for initializing the GMM, for estimating the underlying parameters in the EM step, improved the convergence rate and the system's performance. The system also uses a relative index as a confidence measure in case of contradiction between the GMM and VQ identification results. Simulation results carried out on the voxforge.org speech database using MATLAB highlight the efficacy of the proposed method compared to earlier work.
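The LBG algorithm used to initialize the VQ codebook (and hence the GMM) grows the codebook by repeatedly splitting each centroid into a perturbed pair and refining with k-means iterations. A minimal sketch on 1-D toy data; real MFCC features are multi-dimensional vectors, and the data and split factor here are illustrative assumptions.

```python
def lbg(data, codebook_size, eps=0.01, iters=20):
    """Linde-Buzo-Gray codebook training on 1-D feature values."""
    codebook = [sum(data) / len(data)]           # start from the global mean
    while len(codebook) < codebook_size:
        # Split every centroid into a +/- perturbed pair.
        codebook = ([c * (1 + eps) for c in codebook] +
                    [c * (1 - eps) for c in codebook])
        for _ in range(iters):                   # Lloyd (k-means) refinement
            cells = [[] for _ in codebook]
            for x in data:
                nearest = min(range(len(codebook)),
                              key=lambda i: abs(x - codebook[i]))
                cells[nearest].append(x)
            codebook = [sum(c) / len(c) if c else codebook[i]
                        for i, c in enumerate(cells)]
    return sorted(codebook)

# Two well-separated clusters of toy feature values.
data = [1.0, 1.1, 0.9, 1.05, 5.0, 5.1, 4.9, 5.05]
print(lbg(data, 2))  # two centroids, near 1.0 and 5.0
```

In the paper's pipeline the resulting centroids would seed the GMM component means before EM refinement.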

Keywords: feature extraction, speaker modeling, feature matching, Mel frequency cepstrum coefficient (MFCC), Gaussian mixture model (GMM), vector quantization (VQ), Linde-Buzo-Gray (LBG), expectation maximization (EM), pre-processing, voice activity detection (VAD), short time energy (STE), background noise statistical modeling, closed-set text-independent speaker identification system (CISI)

Procedia PDF Downloads 305
1629 Estimation of Forces Applied to Forearm Using EMG Signal Features to Control of Powered Human Arm Prostheses

Authors: Faruk Ortes, Derya Karabulut, Yunus Ziya Arslan

Abstract:

According to recent experimental research, myoelectric features gathered from the muscular environment are preferentially used to perceive muscle activation and control human arm prostheses. EMG (electromyography) signal based human arm prostheses have shown promising performance in recent years in terms of providing the basic functional requirements of motion for amputated people. However, these assistive devices for neurorehabilitation still have important limitations in enabling amputated people to perform sophisticated or functional movements. The surface electromyogram (EMG) is used as the control signal to command such devices. This kind of control consists of activating a motion in the prosthetic arm using the muscle activation for that same motion. Extraction of clear and certain neural information from EMG signals plays a major role, especially in the fine control of hand prosthesis movements. Many signal processing methods have been utilized for feature extraction from EMG signals. The specific objective of this study was to compare widely used time domain features of the EMG signal, including integrated EMG (IEMG), root mean square (RMS), and waveform length (WL), for the prediction of forces externally applied to human hands. The obtained features were classified using artificial neural networks (ANN) to predict the forces. The EMG signals supplied to this process were recorded during isometric and isotonic muscle contractions. Experiments were performed by three healthy right-handed subjects aged 25-35 years. EMG signals were collected from muscles of the proximal part of the upper body: biceps brachii, triceps brachii, pectoralis major, and trapezius. The force prediction results obtained from the ANN were statistically analyzed, and the merits and pitfalls of the extracted features are discussed in detail.
The obtained results are anticipated to contribute to the classification of EMG signals and to the motion control of powered human arm prostheses.
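The three time-domain features compared in the study have standard definitions: IEMG is the sum of absolute sample values, RMS the root mean square amplitude, and WL the cumulative point-to-point length of the waveform. A minimal sketch over a toy EMG window; the sample values are illustrative.

```python
import math

def iemg(x):
    """Integrated EMG: sum of absolute sample values."""
    return sum(abs(v) for v in x)

def rms(x):
    """Root mean square amplitude of the window."""
    return math.sqrt(sum(v * v for v in x) / len(x))

def waveform_length(x):
    """Cumulative length: sum of absolute sample-to-sample differences."""
    return sum(abs(x[i + 1] - x[i]) for i in range(len(x) - 1))

window = [0.0, 0.4, -0.3, 0.2, -0.1]   # toy EMG samples (mV)
print(iemg(window))               # ~1.0
print(rms(window))                # ~0.245
print(waveform_length(window))    # ~1.9
```

In the study these per-window values would be computed per muscle channel and fed as inputs to the ANN force predictor.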

Keywords: assistive devices for neurorehabilitation, electromyography, feature extraction, force estimation, human arm prosthesis

Procedia PDF Downloads 362
1628 Introduction of Artificial Intelligence for Estimating Fractal Dimension and Its Applications in the Medical Field

Authors: Zerroug Abdelhamid, Danielle Chassoux

Abstract:

Various models are given to simulate homogeneous or heterogeneous cancerous tumors and, in each case, extract the boundary. The fractal dimension is then estimated by the least squares method and compared to some previous methods.
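The least-squares estimation step can be illustrated with the box-counting formulation, where the dimension D is the slope of log N(ε) against log(1/ε) for box size ε and box count N(ε). A sketch on synthetic counts generated with D = 1.5; the power-law data are an illustrative assumption, not the paper's tumor models.

```python
import math

def fractal_dimension(epsilons, counts):
    """Least-squares slope of log N(eps) versus log(1/eps)."""
    xs = [math.log(1.0 / e) for e in epsilons]
    ys = [math.log(n) for n in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

eps = [1 / 2, 1 / 4, 1 / 8, 1 / 16]
counts = [10 * e ** -1.5 for e in eps]   # exact power law with D = 1.5
print(fractal_dimension(eps, counts))    # ~1.5
```

With real boundary pixels, `counts` would come from counting the grid boxes of size ε that intersect the extracted tumor boundary.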

Keywords: simulation, cancerous tumor, Markov fields, fractal dimension, extraction, recovering

Procedia PDF Downloads 359
1627 To Study the Effect of Drying Temperature Towards Extraction of Aquilaria subintegra Dry Leaves Using Vacuum Far Infrared

Authors: Tengku Muhammad Rafi Nazmi Bin Tengku Razali, Habsah Alwi

Abstract:

This article is based on the effect of temperature on the extraction of Aquilaria subintegra, whose main habitat is tropical Asia, particularly its native Thailand. It is claimed that Aquilaria subintegra contains antipyretic properties that help fight fever, while recent research has also shown that paracetamol consumption can bring adverse effects to consumers. The samples were first dried using Vacuum Far Infrared, which provides better drying than a conventional oven. A Soxhlet extractor was used to extract oil from the samples, and a Gas Chromatography Mass Spectrometer was used to analyze the samples and determine their compounds. The objectives of this research were to determine the active ingredients present in Aquilaria subintegra leaves and to determine whether acetaminophen is present in the leaves. Moisture content was 80% at 40 °C, 62% at 50 °C, and 36% at 60 °C; higher drying temperatures resulted in lower moisture content in the sample leaves. Seven components were identified in the sample at T = 40 °C, while only five components were identified in the samples at T = 50 °C and T = 60 °C. Four components were commonly identified in all three samples: n-Hexadecanoic acid, 9,12,15-Octadecatrienoic acid methyl ester (Z,Z,Z), Vitamin E, and Squalene. Further studies with a new series of temperatures are needed to refine the results.

Keywords: aquilaria subintegra, vacuum far infrared, SOXHLET extractor, gas chromatography mass spectrometer, paracetamol

Procedia PDF Downloads 479
1626 Sequential and Combinatorial Pre-Treatment Strategy of Lignocellulose for the Enhanced Enzymatic Hydrolysis of Spent Coffee Waste

Authors: Rajeev Ravindran, Amit K. Jaiswal

Abstract:

Waste from the food-processing industry is produced in large amounts and contains high levels of lignocellulose. Its continuous accumulation in large quantities throughout the year creates a major environmental problem worldwide. The chemical composition of these wastes (polysaccharides contribute up to 75% of the composition) makes them an inexpensive raw material for the production of value-added products such as biofuels, bio-solvents, nanocrystalline cellulose and enzymes. In order to use lignocellulose as the raw material for microbial fermentation, the substrate is subjected to enzymatic treatment, which leads to the release of reducing sugars such as glucose and xylose. However, the inherent properties of lignocellulose, such as the presence of lignin, pectin, acetyl groups and crystalline cellulose, contribute to recalcitrance. This leads to poor sugar yields upon enzymatic hydrolysis of lignocellulose. A pre-treatment method is therefore generally applied before enzymatic treatment of lignocellulose, which essentially removes recalcitrant components in the biomass through structural breakdown. The present study was carried out to identify the best pre-treatment method for the maximum liberation of reducing sugars from spent coffee waste (SPW). SPW was subjected to a range of physical, chemical and physico-chemical pre-treatments, followed by a sequential, combinatorial pre-treatment strategy that combines two or more pre-treatments to attain maximum sugar yield. All the pre-treated samples were analysed for total reducing sugars, followed by identification and quantification of individual sugars by HPLC coupled with an RI detector. In addition, the generation of inhibitory compounds such as furfural and hydroxymethylfurfural (HMF), which can hinder microbial growth and enzyme activity, was also monitored.
Results showed that ultrasound treatment (31.06 mg/L) was the best pre-treatment method based on total reducing sugar content, followed by dilute acid hydrolysis (10.03 mg/L), while galactose was found to be the major monosaccharide in the pre-treated SPW. Finally, the results obtained from the study were used to design a sequential lignocellulose pre-treatment protocol to decrease the formation of enzyme inhibitors and increase sugar yield on enzymatic hydrolysis with a cellulase-hemicellulase consortium. The sequential, combinatorial treatment was found to be better in terms of total reducing sugar yield and lower formation of inhibitory compounds, which could be because this mode of pre-treatment combines several mild treatment methods rather than relying on a single harsh one. It eliminates the need for a detoxification step and has potential application in the valorisation of lignocellulosic food waste.

Keywords: lignocellulose, enzymatic hydrolysis, pre-treatment, ultrasound

Procedia PDF Downloads 360
1625 Ultrasound Assisted Alkaline Potassium Permanganate Pre-Treatment of Spent Coffee Waste

Authors: Rajeev Ravindran, Amit K. Jaiswal

Abstract:

Lignocellulose is the largest reservoir of an inexpensive, renewable source of carbon. It is composed of lignin, cellulose and hemicellulose. Cellulose and hemicellulose are composed of the reducing sugars glucose, xylose and several other monosaccharides, which can be metabolised by microorganisms to produce several value-added products such as biofuels, enzymes and amino acids. Enzymatic treatment of lignocellulose leads to the release of monosaccharides such as glucose and xylose. However, factors such as the presence of lignin, crystalline cellulose, acetyl groups and pectin contribute to recalcitrance, restricting the effective enzymatic hydrolysis of cellulose and hemicellulose. To overcome these problems, pre-treatment of lignocellulose is generally carried out, which essentially facilitates better degradation of lignocellulose. A range of pre-treatment strategies is commonly employed, classified by mode of action as physical, chemical, biological and physico-chemical. However, existing pre-treatment strategies result in lower sugar yields and the formation of inhibitory compounds. To overcome these problems, we propose a novel pre-treatment that utilises the superior oxidising capacity of alkaline potassium permanganate, assisted by ultra-sonication, to break the covalent bonds in spent coffee waste and remove recalcitrant compounds such as lignin. The pre-treatment was conducted for 30 minutes using 2% (w/v) potassium permanganate at room temperature with a solid-to-liquid ratio of 1:10. The pre-treated spent coffee waste (SCW) was subjected to enzymatic hydrolysis using the enzymes cellulase and hemicellulase. Shake flask experiments were conducted with a working volume of 50 mL of buffer containing 1% substrate. The results showed that the novel pre-treatment strategy yielded 7 g/L of reducing sugars after 24 hours, compared to 3.71 g/L obtained from biomass that had undergone dilute acid hydrolysis.
From the results obtained, it is fairly certain that ultrasonication assists the oxidation of recalcitrant components in lignocellulose by potassium permanganate. The enzyme hydrolysis studies suggest that ultrasound-assisted alkaline potassium permanganate pre-treatment is far superior to treatment with dilute acid. Furthermore, SEM, XRD and FTIR analyses were carried out to assess the effect of the new pre-treatment strategy on the structure and crystallinity of the pre-treated spent coffee waste. This novel one-step pre-treatment strategy was implemented under mild conditions and exhibited high efficiency in the enzymatic hydrolysis of spent coffee waste. Further study and scale-up are in progress in order to realise future industrial applications.

Keywords: spent coffee waste, alkaline potassium permanganate, ultra-sonication, physical characterisation

Procedia PDF Downloads 346
1624 BIM-Based Tool for Sustainability Assessment and Certification Documents Provision

Authors: Taki Eddine Seghier, Mohd Hamdan Ahmad, Yaik-Wah Lim, Samuel Opeyemi Williams

Abstract:

The assessment of building sustainability against a specific green benchmark and the preparation of the documents required to receive a green building certification are both major challenges for a green building design team. However, this labour- and time-consuming process can take advantage of available Building Information Modeling (BIM) features such as material take-off and scheduling. Furthermore, the workflow can be automated to track potentially achievable credit points and provide rating feedback for several design options by using integrated Visual Programming (VP) to handle the parameters stored within the BIM model. Hence, this study proposes a BIM-based tool that uses the Green Building Index (GBI) rating system requirements as a unique input case to evaluate building sustainability in the design stage of the building project life cycle. The tool comprises two key models: first, a model for data extraction, calculation and classification of achievable credit points in a green template; second, a model for the generation of the documents required for green building certification. The tool was validated on a BIM model of a residential building, and it serves as proof of concept that building sustainability assessment for GBI certification can be automatically evaluated and documented through BIM.

Keywords: green building rating system, GBRS, building information modeling, BIM, visual programming, VP, sustainability assessment

Procedia PDF Downloads 322
1623 Large-Capacity Image Information Reduction Based on Single-Cue Saliency Map for Retinal Prosthesis System

Authors: Yili Chen, Xiaokun Liang, Zhicheng Zhang, Yaoqin Xie

Abstract:

In an effort to restore visual perception in retinal diseases, an electronic retinal prosthesis with thousands of electrodes has been developed. The image processing strategies of a retinal prosthesis system convert the original images from the camera into a stimulus pattern that can be interpreted by the brain. In practice, the original images have a much higher resolution (256×256) than the stimulus pattern (e.g., 25×25), which poses the technical image processing challenge of large-capacity image information reduction. In this paper, we focus on developing an efficient stimulus pattern extraction algorithm that uses a single-cue saliency map to extract salient objects in the image with an optimal trimming threshold. Experimental results showed that the proposed stimulus pattern extraction algorithm performs well for different scenes in terms of the resulting stimulus pattern. In the algorithm performance experiment, our proposed SCSPE algorithm achieved almost five times the score of Boyle's algorithm. Based on the experiments, we suggest that when there are salient objects in the scene (such as when a blind user meets or talks with people), the trimming threshold should be set around 0.4 × max; in other situations, the trimming threshold can be set between 0.2 × max and 0.4 × max to give a satisfactory stimulus pattern.
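The trimming-threshold rule described above can be sketched as a simple saliency mask (an illustrative reconstruction, not the authors' SCSPE implementation): pixels whose saliency falls below the chosen fraction of the map maximum are discarded.

```python
import numpy as np

def trim_saliency(saliency, fraction=0.4):
    """Keep only pixels whose saliency exceeds fraction * max
    (the trimming threshold)."""
    threshold = fraction * saliency.max()
    return saliency >= threshold

rng = np.random.default_rng(0)
sal = rng.random((25, 25))            # toy saliency map at stimulus resolution
mask = trim_saliency(sal, fraction=0.4)
print(mask.shape, mask.sum())         # boolean stimulus pattern, pixels kept
```

Lowering `fraction` toward 0.2 keeps more of the scene, matching the suggested range for scenes without a single dominant salient object.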

Keywords: retinal prosthesis, image processing, region of interest, saliency map, trimming threshold selection

Procedia PDF Downloads 241
1622 Fishing Waste: A Source of Valuable Products through Anaerobic Treatments

Authors: Luisa Maria Arrechea Fajardo, Luz Stella Cadavid Rodriguez

Abstract:

Fish is one of the most commercialized foods worldwide. However, this industry only takes advantage of about 55% of the product's weight; the rest is converted into waste, which is mainly composed of viscera, gills, scales and spines. Consequently, if these wastes are not used or disposed of properly, they cause serious environmental impacts. This is the case in Tumaco (Colombia), the second largest producer of marine fisheries on the Colombian Pacific coast, where artisanal fishermen process more than 50% of the commercialized volume. There, fishing waste is disposed of primarily in the ocean, causing negative impacts on the environment and society. Therefore, in the present research, a proposal was made to take advantage of fishing waste through anaerobic treatments, through which it is possible to obtain products with high added value from organic waste. The research was carried out in four stages. First, the production of volatile fatty acids (VFA) in semi-continuous 4 L reactors was studied, evaluating three hydraulic retention times (HRT) (10, 7 and 5 days) with four organic loading rates (OLR) (16, 14, 12 and 10 gVS/L/day); the experiment was carried out for 150 days. Subsequently, biogas production from the solid digestate generated in the VFA production reactors was evaluated, initially by determining the biochemical methane potential (BMP) of four total solids concentrations (1, 2, 4 and 6% TS) over 40 days, and then, with the optimal TS concentration (2 gVS/L/day), by evaluating two HRTs (15 and 20 days) in semi-continuous reactors for 100 days. Finally, the processes were integrated under the best conditions found: a first phase of VFA production from fishing waste and a second phase of biogas production from unrecovered VFAs and unprocessed material. Additionally, a VFA membrane extraction system was included.
In the first phase, a liquid digestate with a VFA concentration of 59.04 gVFA/L and a production yield of 0.527 gVFA/gVS was obtained under the best condition found (HRT: 7 days and OLR: 16 gVS/L/day), with acetic acid and isobutyric acid as the predominant acids. In the second phase of biogas production, a BMP of 0.349 Nm3CH4/KgVS was reached, and the best HRT was found to be 20 days. In the integration, isovaleric, butyric and isobutyric acids were the VFAs with the highest extraction percentages; additionally, a 106.67% increase in biogas production was achieved. This research shows that anaerobic treatments are a promising technology for environmentally safe management of fishing waste and lays the foundation for a possible biorefinery.

Keywords: biogas production, fishing waste, VFA membrane extraction, VFA production

Procedia PDF Downloads 112
1621 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using a knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding-window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.
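The sliding-window feature extraction step mentioned above can be sketched as follows. This is an illustrative reconstruction, not the Training Builder tool; the window statistics chosen here (mean, variance and line length) are common EEG features but are assumptions, as the abstract does not list the features used.

```python
import numpy as np

def sliding_window_features(signal, window, step):
    """Extract simple per-window statistics (mean, variance, line length)
    from a 1-D signal using the sliding-window paradigm."""
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        w = signal[start:start + window]
        feats.append([w.mean(), w.var(), np.abs(np.diff(w)).sum()])
    return np.array(feats)

# synthetic EEG-like signal: a sinusoid with additive noise
rng = np.random.default_rng(42)
eeg = np.sin(np.linspace(0, 20 * np.pi, 2560)) + 0.1 * rng.standard_normal(2560)
X = sliding_window_features(eeg, window=256, step=128)
print(X.shape)   # -> (19, 3): one feature row per overlapping window
```

The resulting feature matrix `X` is what a classifier such as a multilayer perceptron would then be trained on, one row per window, with seizure/non-seizure labels.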

Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing

Procedia PDF Downloads 182
1620 Low Frequency Ultrasonic Degassing to Reduce Void Formation in Epoxy Resin and Its Effect on the Thermo-Mechanical Properties of the Cured Polymer

Authors: A. J. Cobley, L. Krishnan

Abstract:

The demand for multi-functional lightweight materials in sectors such as automotive, aerospace, and electronics is growing, and for this reason fibre-reinforced epoxy polymer composites are being widely utilized. The fibre reinforcing material is mainly responsible for the strength and stiffness of the composites, whilst the main role of the epoxy polymer matrix is to distribute the load applied to the fibres as well as to protect the fibres from harmful environmental conditions. The superior properties of fibre-reinforced composites are achieved by combining the best properties of both constituents. Although factors such as the chemical nature of the epoxy and how it is cured have a strong influence on the properties of the epoxy matrix, the method of mixing and degassing the resin can also have a significant impact. The production of a fibre-reinforced epoxy polymer composite will usually begin with the mixing of the epoxy pre-polymer with a hardener and accelerator. Mechanical methods of mixing are often employed for this stage, but such processes naturally introduce air into the mixture, which, if it becomes entrapped, will lead to voids in the subsequent cured polymer. Therefore, degassing is normally utilised after mixing, often by placing the epoxy resin mixture in a vacuum chamber. Although this is reasonably effective, it is an additional process stage; if a method of mixing could be found that degassed the resin mixture at the same time, this would lead to shorter production times, more effective degassing and fewer voids in the final polymer. In this study, the effect of four different methods for mixing and degassing the pre-polymer with hardener and accelerator was investigated. The first two methods were manual stirring and magnetic stirring, both followed by vacuum degassing. The other two techniques were ultrasonic mixing/degassing using a 40 kHz ultrasonic bath and a 20 kHz ultrasonic probe.
The cured cast resin samples were examined under a scanning electron microscope (SEM) and an optical microscope, and with ImageJ analysis software, to study morphological changes, void content and void distribution. Three-point bending tests and differential scanning calorimetry (DSC) were also performed to determine the thermal and mechanical properties of the cured resin. It was found that the use of the 20 kHz ultrasonic probe for mixing/degassing gave the lowest percentage of voids of all the mixing methods in the study. In addition, the percentage of voids found when employing a 40 kHz ultrasonic bath to mix/degas the epoxy polymer mixture was only slightly higher than when magnetic stirrer mixing followed by vacuum degassing was utilized. The effect of ultrasonic mixing/degassing on the thermal and mechanical properties of the cured resin will also be reported. The results suggest that low frequency ultrasound is an effective means of mixing/degassing a pre-polymer mixture and could enable a significant reduction in production times.

Keywords: degassing, low frequency ultrasound, polymer composites, voids

Procedia PDF Downloads 294
1619 Numerical Investigation of Nanofluid Based Thermosyphon System

Authors: Kiran Kumar K., Ramesh Babu Bejjam, Atul Najan

Abstract:

A thermosyphon system is a heat transfer loop that operates on the basis of gravity and buoyancy forces. It guarantees good reliability and low maintenance cost, as it does not involve any mechanical pump. It can therefore be used in many industrial applications such as refrigeration and air conditioning, electronic cooling, nuclear reactors, geothermal heat extraction, etc. However, flow instabilities and loop configuration are the major problems in this system. Several previous studies have found that these instabilities can be suppressed by using nanofluids as the loop fluid. In the present study, a rectangular thermosyphon loop with end heat exchangers is considered. This configuration is appropriate for many practical applications such as solar water heaters and geothermal heat extraction. In the present work, a steady-state analysis is carried out on a thermosyphon loop with parallel-flow coaxial heat exchangers at the heat source and heat sink. In this loop, a nanofluid is considered as the loop fluid and water as the external fluid in both the hot and cold heat exchangers. For this analysis, a one-dimensional homogeneous model is developed, in which the conservation equations for mass, momentum and energy are discretized using the finite difference method. A computer code is written in MATLAB to simulate the flow in the thermosyphon loop. A comparison in terms of heat transfer is made between water and nanofluid as working fluids in the loop.
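As a minimal illustration of the finite-difference discretization mentioned above (a generic sketch, not the authors' MATLAB loop model), the example below solves steady one-dimensional conduction, d²T/dx² = 0, whose interior-node stencil is T[i-1] − 2T[i] + T[i+1] = 0; the solution should be a linear profile between the boundary temperatures.

```python
import numpy as np

def steady_conduction_1d(t_left, t_right, n=11):
    """Finite-difference solution of steady 1-D conduction d2T/dx2 = 0.
    Interior nodes satisfy T[i-1] - 2*T[i] + T[i+1] = 0; the two
    boundary nodes are fixed at the prescribed temperatures."""
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0           # Dirichlet boundary conditions
    b[0], b[-1] = t_left, t_right
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
    return np.linalg.solve(A, b)

T = steady_conduction_1d(90.0, 30.0)
print(T[0], T[len(T) // 2], T[-1])      # linear profile between boundaries
```

The same assemble-and-solve pattern extends to the coupled mass, momentum and energy equations of a loop model, with each conservation equation contributing its own rows to the discretized system.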

Keywords: heat exchanger, heat transfer, nanofluid, thermosyphon loop

Procedia PDF Downloads 472
1618 High Performance Liquid Cooling Garment (LCG) Using ThermoCore

Authors: Venkat Kamavaram, Ravi Pare

Abstract:

Modern warfighters experience extreme environmental conditions in many of their operational and training activities. At temperatures exceeding 95°F, the body can no longer cool itself through convection and radiation; the only remaining cooling mechanism is evaporation. However, evaporative cooling is often compromised by excessive humidity, and natural cooling mechanisms can be further compromised by clothing and protective gear, which trap hot air and moisture close to the body. Creating an efficient heat extraction apparel system that is lightweight and does not hinder the dexterity or mobility of personnel working in extreme temperatures is a difficult technical challenge, and one that needs to be addressed to increase the probability of future success for the US military. To address this challenge, Oceanit Laboratories, Inc. has developed and patented a Liquid Cooled Garment (LCG) more effective than any on the market today. Oceanit's LCG is a form-fitting garment with a network of thermally conductive tubes that extracts body heat, and it can be worn under all authorized and chemical/biological protective clothing. Oceanit specifically designed and developed ThermoCore®, a thermally conductive polymer, for use in this apparel, optimizing the product for thermal conductivity, mechanical properties, manufacturability, and performance temperatures. Thermal manikin tests were conducted in accordance with ASTM F2371, Standard Test Method for Measuring the Heat Removal Rate of Personal Cooling Systems Using a Sweating Heated Manikin, in an environmental chamber using a 20-zone sweating thermal manikin. The manikin test results showed that Oceanit's LCG provides significantly higher heat extraction under the same environmental conditions than the currently fielded Environmental Control Vest (ECV) while at the same time reducing the weight.
Oceanit's LCG vests performed nearly 30% better in extracting body heat while weighing 15% less than the ECV. The two cooling garments that are commercially available and most commonly used are the Environmental Control Vest (ECV) and the Microclimate Cooling Garment (MCG); no cooling garment on the market provides the same thermal extraction performance, form factor, and reduced weight as Oceanit's LCG.

Keywords: thermally conductive composite, tubing, garment design, form fitting vest, thermocore

Procedia PDF Downloads 108
1617 Simulations of Cryogenic Cavitation of Low Temperature Fluids with Thermodynamics Effects

Authors: A. Alhelfi, B. Sunden

Abstract:

Cavitation in cryogenic liquids is widely encountered in contemporary science and engineering. In the current study, we re-examine a previously validated acoustic cavitation model that was developed for a gas bubble in liquid water. Furthermore, simulations of cryogenic fluids are presented, including the thermal effect and the effects of the acoustic pressure amplitude and the frequency of the sound field on the bubble dynamics. A helium gas bubble in liquid nitrogen, oxygen and hydrogen in an acoustic field at ambient pressure and low temperature is investigated numerically. The results reveal that the bubble oscillates more strongly in liquid hydrogen than in liquid oxygen or nitrogen, while its oscillations in liquid oxygen and nitrogen are approximately similar.

Keywords: cryogenic liquids, cavitation, rocket engineering, ultrasound

Procedia PDF Downloads 318
1616 Traffic Prediction with Raw Data Utilization and Context Building

Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao

Abstract:

Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain has chiefly had two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of the raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses these issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain the intra-trajectory features with graph-based encoding and the inter-trajectory ones with a grid-based model and the technique of back projection, which restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information that is around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.

Keywords: traffic prediction, raw data utilization, context building, data reduction

Procedia PDF Downloads 121