Search results for: features extraction
4476 Some Imaginative Geomorphosites in Malaysia: Study on Their Formations and Geotourism Potentials
Authors: Dony Adriansyah Nazaruddin, Mohammad Muqtada Ali Khan
Abstract:
This paper presents some imaginative geomorphological sites in Malaysia. The study comprises a desk study and a field study. The desk study was conducted by reviewing the literature related to the topic and to some geomorphosites in Malaysia. The field study was carried out in 2013 and 2014 to investigate the current condition of these sites and to take measurements, photographs, and rock samples. Examples of imaginative geomorphosites across Malaysia were identified for this purpose. In Peninsular Malaysia, some geomorphosites in the Langkawi Islands (the state of Kedah) have imaginative features, such as a “turtle” atop the limestone hill of the Setul Formation at the Kilim Geoforest Park, a “shoe” at Kasut Island of the Kilim Geoforest Park, a “lying pregnant lady” at Dayang Bunting Island of the Dayang Bunting Marble Geoforest Park, and a “ship” at Singa Kecil Island. Other examples come from the state of Kelantan, such as a mogote hill with a “human face looking upward” at Gunung Reng, Jeli District, and a “boat rock” at Mount Chamah, Gua Musang District. In East Malaysia, only one example could be identified: the “Abraham Lincoln’s face” at the Deer Cave, Gunung Mulu National Park, Sarawak. Karst landforms dominate the imaginative geomorphosites in Malaysia. The formation of these features is governed by endogenic and exogenic processes such as tectonic uplift, weathering (including solution), and erosion. This study recommends that these imaginative features be conserved and developed for purposes such as research, education, and geotourism development in Malaysia.
Keywords: geomorphosite, geotourism, earth processes, karst landforms, Malaysia
Procedia PDF Downloads 627
4475 Best-Performing Color Space for Land-Sea Segmentation Using Wavelet Transform Color-Texture Features and Fusion of Over Segmentation
Authors: Seynabou Toure, Oumar Diop, Kidiyo Kpalma, Amadou S. Maiga
Abstract:
Color and texture are the two most determinant elements for the perception and recognition of objects in an image. For this reason, color and texture analysis finds a large field of application, for example in image classification and segmentation. However, the pioneering work in texture analysis was conducted on grayscale images, thus discarding color information. Many grey-level texture descriptors have been proposed and successfully used in numerous domains for image classification: face recognition, industrial inspection, food science, and medical imaging, among others. Taking color into account in the definition of these descriptors makes it possible to characterize images better. Color texture is thus the subject of recent work, and the analysis of color texture images is increasingly attracting interest in the scientific community. In optical remote sensing systems, sensors measure separately different parts of the electromagnetic spectrum: the visible ones and even those that are invisible to the human eye. The amounts of light reflected by the earth in these spectral bands are then transformed into grayscale images. The primary colors Red (R), Green (G), and Blue (B) are then assigned to mixtures of different spectral bands in order to produce RGB images. Thus, good color texture discrimination can be achieved using RGB under controlled illumination conditions. Some previous works investigate the effect of using different color spaces for color texture classification. However, the selection of the best-performing color space for land-sea segmentation is an open question. Its resolution may bring considerable improvements in certain applications, like coastline detection, where the detection result is strongly dependent on the performance of the land-sea segmentation. The aim of this paper is to present the results of a study conducted on different color spaces in order to identify the best-performing color space for land-sea segmentation.
In this sense, an experimental analysis is carried out using five different color spaces (RGB, XYZ, Lab, HSV, YCbCr). For each color space, the Haar wavelet decomposition is used to extract different color texture features. These color texture features are then used for Fusion of Over Segmentation (FOOS) based classification, which allows segmentation of the land part from the sea. By analyzing the different results of this study, the HSV color space is found to give the best classification performance when using color and texture features, which is consistent with the results reported in the literature.
Keywords: classification, coastline, color, sea-land segmentation
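The wavelet color-texture step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: a single-level 2-D Haar transform and per-band energy features are written in plain numpy, and the toy "sea" and "land" patches are invented for demonstration.

```python
import numpy as np

def haar_level1(channel):
    """One level of the 2-D Haar wavelet transform (orthonormal)."""
    a = channel[0::2, 0::2]
    b = channel[0::2, 1::2]
    c = channel[1::2, 0::2]
    d = channel[1::2, 1::2]
    ll = (a + b + c + d) / 2.0  # approximation
    lh = (a + b - c - d) / 2.0  # horizontal detail
    hl = (a - b + c - d) / 2.0  # vertical detail
    hh = (a - b - c + d) / 2.0  # diagonal detail
    return ll, lh, hl, hh

def color_texture_features(image):
    """Energy of each detail subband, computed per color channel."""
    feats = []
    for ch in range(image.shape[2]):
        _, lh, hl, hh = haar_level1(image[:, :, ch].astype(float))
        feats += [np.mean(lh ** 2), np.mean(hl ** 2), np.mean(hh ** 2)]
    return np.array(feats)

# Smooth "sea" patch vs. striped "land" patch: texture energy separates them.
sea = np.full((8, 8, 3), 0.2)
land = np.zeros((8, 8, 3))
land[:, ::2, :] = 1.0
print(color_texture_features(sea))   # no texture: all detail energies are zero
print(color_texture_features(land))  # energy concentrates in one detail band
```

In the paper's setting, these per-band energies would be computed per channel in each candidate color space (RGB, HSV, etc.) and fed to the FOOS classification; here the feature vector merely demonstrates that texture energy discriminates the two patches.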
Procedia PDF Downloads 248
4474 Evaluation of an Integrated Supersonic System for Inertial Extraction of CO₂ in Post-Combustion Streams of Fossil Fuel Operating Power Plants
Authors: Zarina Chokparova, Ighor Uzhinsky
Abstract:
Carbon dioxide emissions resulting from large-scale burning of fossil fuels, for example in the oil industry or in power plants, lead to severe consequences, including global temperature rise, air pollution, and other adverse impacts on the environment. Besides some precarious and costly approaches to mitigating CO₂ emissions at industrial scales (such as liquefaction of CO₂ and deep-water treatment, or the application of adsorbents and membranes, which require careful consideration of their drawbacks), one physically and commercially viable technology for its capture and disposal is a supersonic system for inertial extraction of CO₂ from post-combustion streams. The flue gas emitted from the combustion system has a carbon dioxide concentration of only 10-15 volume percent, so the waste stream is rather dilute and at low pressure. The supersonic system expands the flue gas mixture through a converging-diverging nozzle; the flow velocity increases to supersonic ranges, resulting in a rapid drop of temperature and pressure. The conversion of potential energy into kinetic energy thus causes desublimation of CO₂. The solidified carbon dioxide can be sent to a separate vessel for further disposal. The major advantages of the current solution are its economic efficiency, physical stability, and compactness, as well as the fact that no chemical media need to be added. However, several challenges remain in optimizing the system: increasing the size of the separated CO₂ particles (whose effective diameters are on the micrometer scale), reducing the amount of concomitant gas separated together with the carbon dioxide, and ensuring the purity of the CO₂ downstream flow. Moreover, determining the thermodynamic conditions of the vapor-solid mixture, including the specification of a valid and accurate equation of state, remains an essential goal.
Due to the high speeds and temperatures reached during the process, the influence of the emitted heat should be considered, and an applicable solution model for the compressible flow needs to be determined. In this report, a brief overview of the current technology status is presented, and a program for further evaluation of this approach is proposed.
Keywords: CO₂ sequestration, converging diverging nozzle, fossil fuel power plant emissions, inertial CO₂ extraction, supersonic post-combustion carbon dioxide capture
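The temperature and pressure drop through the converging-diverging nozzle can be illustrated with the standard isentropic-flow relations. This is a textbook sketch under simplifying assumptions (ideal gas, adiabatic, frictionless flow); the stagnation conditions, Mach number, and heat-capacity ratio below are illustrative values, not the authors' operating point.

```python
def isentropic_static(T0, p0, M, gamma=1.3):
    """Static temperature and pressure after isentropic expansion to Mach M.

    T0, p0: stagnation temperature [K] and pressure [Pa]; gamma: heat-capacity
    ratio (~1.3 assumed here for flue gas, an illustrative value).
    """
    ratio = 1.0 + 0.5 * (gamma - 1.0) * M ** 2
    T = T0 / ratio                               # T0/T = 1 + (gamma-1)/2 * M^2
    p = p0 / ratio ** (gamma / (gamma - 1.0))    # p0/p = (T0/T)^(gamma/(gamma-1))
    return T, p

# Flue gas at 320 K and 1 bar expanded to Mach 3 (illustrative values)
T, p = isentropic_static(320.0, 1.0e5, 3.0)
print(T, p)
```

At Mach 3 the computed static temperature falls below the ~195 K sublimation temperature of CO₂ at atmospheric pressure, illustrating the desublimation mechanism the system exploits; the actual phase boundary shifts with the local CO₂ partial pressure.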
Procedia PDF Downloads 141
4473 The Studies of the Sorption Capabilities of the Porous Microspheres with Lignin
Authors: M. Goliszek, M. Sobiesiak, O. Sevastyanova, B. Podkoscielna
Abstract:
Lignin is one of the three main constituents of biomass, together with cellulose and hemicellulose. It is a complex biopolymer that contains a large number of functional groups in its structure, including aliphatic and aromatic hydroxyl groups, carboxylic groups, and methoxy groups, which is why it shows potential for sorption processes. Lignin is a highly cross-linked polymer with a three-dimensional structure that can provide large surface areas and pore volumes. It can also possess better dispersion, diffusion, and mass transfer behavior in the removal of, e.g., heavy metal ions or aromatic pollutants. In this work, an emulsion-suspension copolymerization method was used to synthesize porous microspheres of divinylbenzene (DVB), styrene (St), and lignin. Microspheres without lignin were also prepared for comparison. Before copolymerization, the lignin was modified with methacryloyl chloride to improve its reactivity with the other monomers. The physico-chemical properties of the obtained microspheres were also studied, e.g., pore structures (adsorption-desorption measurements), thermal properties (DSC), tendencies to swell, and the actual shapes. Due to their well-developed porous structure and the presence of functional groups, our materials may have great potential in sorption processes. To estimate the sorption capabilities of the microspheres towards phenol and its chlorinated derivatives, the off-line SPE (solid-phase extraction) method will be applied. This method has various advantages: it is low-cost, easy to use, and enables rapid measurements for a large number of chemicals. The efficiency of the materials in removing phenols from aqueous solution and in desorption processes will be evaluated.
Keywords: microspheres, lignin, sorption, solid-phase extraction
Procedia PDF Downloads 183
4472 Device for Reversible Hydrogen Isotope Storage with Aluminum Oxide Ceramic Case
Authors: Igor P. Maximkin, Arkady A. Yukhimchuk, Victor V. Baluev, Igor L. Malkov, Rafael K. Musyaev, Damir T. Sitdikov, Alexey V. Buchirin, Vasily V. Tikhonov
Abstract:
Minimization of tritium diffusion leakage when developing devices that handle tritium-containing media is a key problem whose solution will substantially enhance radiation safety and minimize diffusion losses of expensive tritium. One way to solve this problem is to use high-strength, non-porous Al₂O₃ ceramic as the structural material of the bed body. This alumina ceramic offers high strength, but its main advantages are low hydrogen permeability (compared with the commonly used structural materials) and high dielectric properties. The latter enables direct induction heating of a hydride-forming metal without substantial heating of the pressure and containment vessel. The use of alumina ceramic and induction heating allows: a substantial reduction of tritium extraction time; a reduction of tritium diffusion leakage by several orders of magnitude; and more complete extraction of tritium from metal hydrides, owing to the possibility of heating them up to melting in the event of final disposal of the device. The paper presents computational and experimental results for a tritium bed designed to absorb 6 liters of tritium. Titanium was used as the hydrogen isotope sorbent. Results of hydrogen release kinetics from the hydride-forming metal, as well as strength and cyclic service life tests, are reported. Recommendations are also provided for the practical use of the given bed type.
Keywords: aluminum oxide ceramic, hydrogen pressure, hydrogen isotope storage, titanium hydride
Procedia PDF Downloads 407
4471 Image Retrieval Based on Multi-Feature Fusion for Heterogeneous Image Databases
Authors: N. W. U. D. Chathurani, Shlomo Geva, Vinod Chandran, Proboda Rajapaksha
Abstract:
Selecting an appropriate image representation is the most important factor in implementing an effective Content-Based Image Retrieval (CBIR) system. This paper presents a multi-feature fusion approach for efficient CBIR, based on the distance distributions of the features and relative feature weights computed at query time. It is a simple yet effective approach that is free from the effects of the features' dimensions, ranges, internal feature normalization, and the distance measure. This approach can easily be adopted with any feature combination to improve retrieval quality. The proposed approach is empirically evaluated using two benchmark datasets for image classification (a subset of the Corel dataset, and Oliva and Torralba) and compared with existing approaches. It significantly outperforms the independently evaluated baselines of previously proposed feature fusion approaches.
Keywords: feature fusion, image retrieval, membership function, normalization
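One plausible reading of the query-time fusion described above can be sketched as follows; this is an assumption-laden illustration, not the paper's exact formulation. The mean-based normalization of each feature's distance distribution, the function names, and the toy features are all invented for demonstration; the point is that normalizing each distance distribution before combining makes the fusion insensitive to feature dimension and range.

```python
import numpy as np

def fused_ranking(query_feats, db_feats, weights=None):
    """Fuse per-feature distances by normalizing each feature's distance
    distribution before a weighted combination (hypothetical sketch)."""
    n_db = len(next(iter(db_feats.values())))
    fused = np.zeros(n_db)
    weights = weights or {name: 1.0 for name in query_feats}
    for name, q in query_feats.items():
        d = np.linalg.norm(db_feats[name] - q, axis=1)  # distances to all DB images
        d = d / (d.mean() + 1e-12)                      # distribution-based normalization
        fused += weights[name] * d
    return np.argsort(fused)                            # best matches first

# Two toy features with wildly different scales; image 0 is the true match.
db = {"color": np.array([[0.1], [5.0], [9.0]]),
      "texture": np.array([[100.0], [900.0], [500.0]])}
q = {"color": np.array([0.0]), "texture": np.array([110.0])}
print(fused_ranking(q, db))  # image 0 is ranked first
```

Without the per-feature normalization, the large-valued "texture" feature would dominate the sum purely because of its range.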
Procedia PDF Downloads 345
4470 Optimized Weight Selection of Control Data Based on Quotient Space of Multi-Geometric Features
Authors: Bo Wang
Abstract:
The geometric processing of multi-source remote sensing data using control data of different scales and accuracies is an important research direction for multi-platform earth observation systems. Existing block bundle adjustment methods treat the control information in the adjustment system at a single observation scale and precision; they are therefore unable to screen the control information and assign reasonable, effective weights, which reduces the convergence and reliability of the adjustment results. Drawing on the theory and technology of quotient spaces, this project researches several subjects. A multi-layer quotient space of multi-geometric features is constructed to describe and filter the control data. A normalized granularity merging mechanism for multi-layer control information is studied, and based on the normalized scale factor, a strategy is realized to optimize the weight selection of control data that is less relevant to the adjustment system. At the same time, geometric positioning experiments are conducted using multi-source remote sensing data, aerial images, and several classes of control data to verify the theoretical research results. This research is expected to move beyond the convention of single-scale, single-accuracy control data in the adjustment process and to extend the theory and technology of photogrammetry, so that the problem of processing multi-source remote sensing data is addressed both theoretically and practically.
Keywords: multi-source image geometric process, high precision geometric positioning, quotient space of multi-geometric features, optimized weight selection
Procedia PDF Downloads 285
4469 Influence of the Cooking Technique on the Iodine Content of Frozen Hake
Authors: F. Deng, R. Sanchez, A. Beltran, S. Maestre
Abstract:
The high nutritional value associated with seafood is related to the presence of essential trace elements. Moreover, seafood is considered an important source of energy, proteins, and long-chain polyunsaturated fatty acids. Generally, seafood is consumed cooked; consequently, its nutritional value can be degraded. Seafood, such as fish, shellfish, and seaweed, is one of the main sources of iodine. Deficient or excessive consumption of iodine can cause dysfunctions and pathologies of the thyroid gland. The main objective of this work is to evaluate iodine stability in hake (Merluccius) subjected to different culinary techniques. The culinary processes considered were: boiling, steaming, microwave cooking, baking, cooking en papillote (a twisted cover with the shape of a sweet wrapper), and coating with a batter of flour and deep-frying. The determination of iodine was carried out by Inductively Coupled Plasma Mass Spectrometry (ICP-MS). Regarding sample handling strategies, liquid-liquid extraction has been demonstrated to be a powerful pre-concentration and clean-up approach for trace metal analysis by ICP techniques. Extraction with tetramethylammonium hydroxide (TMAH reagent) was used as the sample preparation method in this work. Based on the results, it can be concluded that iodine is degraded by the cooking processes. The greatest losses were observed for boiling and microwave cooking, with the iodine content of hake decreasing by up to 60% and 52%, respectively. However, if the cooking liquid from boiling is retained, the loss generated during cooking is reduced. Only when the fish was cooked en papillote was the iodine content preserved.
Keywords: cooking process, ICP-MS, iodine, hake
Procedia PDF Downloads 142
4468 Automatic Classification of the Stand-to-Sit Phase in the TUG Test Using Machine Learning
Authors: Yasmine Abu Adla, Racha Soubra, Milana Kasab, Mohamad O. Diab, Aly Chkeir
Abstract:
Over the past several years, researchers have shown great interest in assessing the mobility of elderly people to measure their functional status. Usually, such an assessment is done by conducting tests that require the subject to walk a certain distance, turn around, and finally sit back down. This study aims to provide an at-home monitoring system that assesses the patient’s status continuously. Thus, we propose a technique to automatically detect when a subject sits down while walking at home. In this study, we utilized a Doppler radar system to capture the motion of the subjects. More than 20 features were extracted from the radar signals, out of which 11 were chosen based on their intraclass correlation coefficient (ICC > 0.75). The sequential floating forward selection wrapper was then applied to further narrow down the final feature vector. Finally, 5 features were introduced to a linear discriminant analysis classifier, achieving an accuracy of 93.75%, with a precision and recall of 95% and 90%, respectively.
Keywords: Doppler radar system, stand-to-sit phase, TUG test, machine learning, classification
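The final classification stage can be sketched as a two-class Fisher linear discriminant, a minimal stand-in for the linear discriminant analysis classifier mentioned above. The synthetic "walking" and "stand-to-sit" feature clusters below are invented for demonstration (they are not the radar-derived features), and the ICC and sequential floating forward selection steps are omitted.

```python
import numpy as np

def fit_lda(X, y):
    """Two-class Fisher LDA: returns a weight vector and decision threshold."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)  # within-class scatter
    w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), m1 - m0)
    b = w @ (m0 + m1) / 2.0                                   # midpoint threshold
    return w, b

def predict(X, w, b):
    return (X @ w > b).astype(int)

# Synthetic 5-dimensional features: walking (class 0) vs. stand-to-sit (class 1)
rng = np.random.default_rng(0)
walk = rng.normal(0.0, 0.3, size=(40, 5))
sit = rng.normal(1.0, 0.3, size=(40, 5))
X = np.vstack([walk, sit])
y = np.array([0] * 40 + [1] * 40)
w, b = fit_lda(X, y)
acc = (predict(X, w, b) == y).mean()
print(acc)
```

With well-separated clusters the projected classes barely overlap, so the training accuracy is close to 1; the paper's reported 93.75% reflects the much harder real radar data.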
Procedia PDF Downloads 161
4467 Developed CNN Model with Various Input Scale Data Evaluation for Bearing Faults Prognostics
Authors: Anas H. Aljemely, Jianping Xuan
Abstract:
Rolling bearing fault diagnosis is a pivotal issue for the rotating machinery of modern manufacturing. In this research, an improved deep learning method operating on raw vibration signals is proposed for bearing fault diagnosis. Multiple input scales of the raw vibration signals are evaluated for the condition monitoring system, and the deep learning process shows its effectiveness in fault diagnosis. The proposed method employs an exponential linear unit (ELU) layer in a convolutional neural network (CNN); the ELU applies the identity function to positive inputs and an exponential nonlinearity to negative inputs, while a particular convolutional operation extracts valuable features. The identification results show that the improved method achieves the highest accuracy with a 100-dimensional input scale and increases training and testing speed.
Keywords: bearing fault prognostics, developed CNN model, multiple-scale evaluation, deep learning features
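The ELU nonlinearity described above has a standard closed form: the identity for positive inputs and α(eˣ − 1) for non-positive ones. A minimal numpy version (the α value and sample inputs are illustrative):

```python
import numpy as np

def elu(x, alpha=1.0):
    """Exponential Linear Unit: x for x > 0, alpha * (exp(x) - 1) for x <= 0."""
    # np.minimum keeps the exp argument non-positive, avoiding overflow
    return np.where(x > 0, x, alpha * (np.exp(np.minimum(x, 0.0)) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(elu(x))
```

Unlike ReLU, the ELU saturates smoothly to −α for large negative inputs and has a nonzero gradient there, which is the property the abstract leans on for feature extraction from raw vibration signals.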
Procedia PDF Downloads 210
4466 Calibration of Residential Buildings Energy Simulations Using Real Data from an Extensive in situ Sensor Network – A Study of Energy Performance Gap
Authors: Mathieu Bourdeau, Philippe Basset, Julien Waeytens, Elyes Nefzaoui
Abstract:
As residential buildings account for a third of the overall energy consumption and greenhouse gas emissions in Europe, building energy modeling is an essential tool to reach energy efficiency goals. In the energy modeling process, calibration is a mandatory step to obtain accurate and reliable energy simulations. Nevertheless, the comparison between simulation results and the actual building energy behavior often highlights a significant performance gap. The literature discusses different origins of energy performance gaps, from building design to building operation. The description of building operation in energy models, especially energy usages and users’ behavior, plays an important role in the reliability of simulations, but it is also the most accessible target for post-occupancy energy management and optimization. Therefore, the present study discusses results on the calibration of residential building energy models using real operation data. Data are collected through a network of more than 180 sensors and advanced energy meters deployed in three collective residential buildings undergoing major retrofit actions. The sensor network is implemented at building scale and in an eight-apartment sample. Data are collected for over a year and a half and cover building energy behavior (thermal and electricity), indoor environment, inhabitants’ comfort, occupancy, occupants’ behavior and energy uses, and local weather. Building energy simulations are performed using physics-based building energy modeling software (Pleiades), where the buildings’ features are implemented according to the buildings’ thermal regulation code compliance study and the retrofit project technical files. Sensitivity analyses are performed to highlight the most energy-driving building features for each end-use. These features are then compared with the collected post-occupancy data.
Energy-driving features are progressively replaced with field data for a step-by-step calibration of the energy model. The results of this study provide an analysis of the energy performance gap for an existing residential case study undergoing deep retrofit actions. They highlight the impact of the different building features on the energy behavior and the performance gap in this context, such as temperature setpoints, indoor occupancy, and the building envelope properties, but also domestic hot water usage and heat gains from electric appliances. The benefits of inputting field data from an extensive instrumentation campaign instead of standardized scenarios are also described. Finally, the exhaustive instrumentation solution provides useful insights on the needs, advantages, and shortcomings of the implemented sensor network for its replicability on a larger scale and for different use cases.
Keywords: calibration, building energy modeling, performance gap, sensor network
Procedia PDF Downloads 160
4465 Impacts of Climate Change and Natural Gas Operations on the Hydrology of Northeastern BC, Canada: Quantifying the Water Budget for Coles Lake
Authors: Sina Abadzadesahraei, Stephen Déry, John Rex
Abstract:
Climate research has repeatedly identified strong associations between anthropogenic emissions of ‘greenhouse gases’ and observed increases of global mean surface air temperature over the past century. Studies have also demonstrated that the degree of warming varies regionally. Canada is not exempt from this situation, and evidence is mounting that climate change is beginning to cause diverse impacts in both environmental and socio-economic spheres of interest. For example, northeastern British Columbia (BC), whose climate is controlled by a combination of maritime, continental, and arctic influences, is warming at a greater rate than the remainder of the province. There are indications that these changing conditions are already leading to shifting patterns in the region’s hydrological cycle, and thus in its available water resources. Coincident with these changes, northeastern BC is undergoing rapid development for oil and gas extraction, which depends largely on subsurface hydraulic fracturing (‘fracking’) and uses enormous amounts of freshwater. While this industrial activity has made substantial contributions to regional and provincial economies, it is important to ensure that sufficient and sustainable water supplies are available for all those dependent on the resource, including ecological systems. This in turn demands a comprehensive understanding of how water in all its forms interacts with landscapes and the atmosphere, and of the potential impacts of changing climatic conditions on these processes. The aim of this study is therefore to characterize and quantify all components of the water budget in the small watershed of Coles Lake (141.8 km², 100 km north of Fort Nelson, BC) through a combination of field observations and numerical modelling.
This baseline information will aid the assessment of the sustainability of current and future plans for freshwater extraction by the oil and gas industry, and will help to maintain the precarious balance between economic and environmental well-being. This project is an example of interdisciplinary research, in that it not only examines the hydrology of the region but also investigates how natural gas operations and growth can affect water resources. A collaboration between academia, government, and industry has therefore been established to fulfill the objectives of this research in a meaningful manner. The project aims to provide numerous benefits to BC communities. Further, its outcomes and detailed information can be a valuable asset to researchers examining the effect of climate change on water resources worldwide.
Keywords: northeastern British Columbia, water resources, climate change, oil and gas extraction
Procedia PDF Downloads 264
4464 Characterisation of Fractions Extracted from Sorghum Byproducts
Authors: Prima Luna, Afroditi Chatzifragkou, Dimitris Charalampopoulos
Abstract:
Sorghum byproducts, namely bran, stalk, and panicle, are examples of lignocellulosic biomass. These raw materials contain large amounts of polysaccharides, in particular hemicelluloses, celluloses, and lignins, which, if efficiently extracted, can be utilised for the development of a range of added-value products with potential applications in the agriculture and food packaging sectors. The aim of this study was to characterise fractions extracted from sorghum bran and stalk with regard to the physicochemical properties that could determine their applicability as food-packaging materials. Sequential alkaline extraction was applied for the isolation of cellulosic, hemicellulosic, and lignin fractions from sorghum stalk and bran. Lignin content, phenolic content, and antioxidant capacity were also investigated in the case of the lignin fraction. Thermal analysis using differential scanning calorimetry (DSC) and X-ray diffraction (XRD) revealed that the glass transition temperature (Tg) of the cellulose fraction of the stalk was ~78.33 °C, at an amorphous state of ~65% and a water content of ~5%. For hemicellulose, the Tg value of the stalk was slightly lower than that of the bran, at an amorphous state of ~54% and a lower water content (~2%). It is evident that the hemicelluloses generally showed lower thermal stability than cellulose, probably due to their lack of crystallinity. Additionally, the bran had a higher arabinose-to-xylose ratio (0.82) than the stalk, a fact that indicates its low crystallinity. Furthermore, the lignin fraction had a Tg value of ~93 °C at an amorphous state of ~11%. The stalk-derived lignin fraction contained more phenolic compounds (mainly p-coumaric and ferulic acid) and had a higher lignin content and antioxidant capacity than the bran-derived lignin fraction.
Keywords: alkaline extraction, bran, cellulose, hemicellulose, lignin, stalk
Procedia PDF Downloads 299
4463 Reminiscence Therapy for Alzheimer’s Disease Restrained on Logistic Regression Based Linear Bootstrap Aggregating
Authors: P. S. Jagadeesh Kumar, Mingmin Pan, Xianpei Li, Yanmin Yuan, Tracy Lin Huan
Abstract:
Researchers are conducting intensive research into the inherited features of Alzheimer’s disease and promising consistent therapies. In Alzheimer’s, memories are lost in reverse order: memories formed recently are more transitory than older ones. Reminiscence therapy involves the discussion of past activities, events, and experiences with another individual or group of people, frequently with the help of tangible reminders such as photos, household items and other familiar objects from the past, music, and archived recordings. In this manuscript, the effectiveness of reminiscence therapy for Alzheimer’s disease is measured using logistic regression based linear bootstrap aggregating. Logistic regression is used to predict the experiential features of the patient’s memory across the various therapies. Linear bootstrap aggregating provides better stability and accuracy in the statistical classification and regression of memories related to validation therapy, supportive psychotherapy, sensory integration, and simulated presence therapy.
Keywords: Alzheimer’s disease, linear bootstrap aggregating, logistic regression, reminiscence therapy
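The combination named above (logistic regression base learners, combined by bootstrap aggregating) can be sketched in plain numpy. This is a generic illustration of the technique, not the study's model: the gradient-descent fit, the averaged-probability vote, and the synthetic "therapy-response" data are all assumptions.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=200):
    """Plain gradient-descent logistic regression (bias folded into weights)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def bagged_predict(X, models):
    """Bootstrap aggregating: average the base models' predicted probabilities."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    probs = np.mean([1.0 / (1.0 + np.exp(-Xb @ w)) for w in models], axis=0)
    return (probs > 0.5).astype(int)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # synthetic response labels

models = []
for _ in range(15):                      # 15 bootstrap resamples
    idx = rng.integers(0, len(X), len(X))
    models.append(fit_logistic(X[idx], y[idx]))

print((bagged_predict(X, models) == y).mean())
```

Averaging over bootstrap resamples is what gives bagging its stability: each base model sees a perturbed sample, and the vote smooths out their individual variance.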
Procedia PDF Downloads 310
4462 Analysis of Different Resins in Web-to-Flange Joints
Authors: W. F. Ribeiro, J. L. N. Góes
Abstract:
The industrial process gives engineered wood products features absent in solid wood: a homogeneous structure, reduced defects, improved physical and mechanical properties, resistance to bio-deterioration, and better dimensional stability, improving quality and increasing the reliability of wood structures. These features, combined with the use of fast-growing trees, make them environmentally sound products, ensuring a strong consumer market. Wood I-joists are manufactured by industrially bonding flange and web profiles; an important aspect of the production of wooden I-beams is the adhesive joint that bonds the web to the flange. Adhesives can effectively transfer and distribute stresses, thereby increasing the strength and stiffness of the composite. The objective of this study is to evaluate different resins in shear test specimens, with the aim of identifying the most efficient resin and assessing the possibility of using national products to reduce the manufacturing cost. First, a literature review was conducted to establish the geometry and materials generally used; then, eight national resins were selected and analyzed, and six specimens were produced for each.
Keywords: engineered wood products, structural resin, wood i-joist, Pinus taeda
Procedia PDF Downloads 278
4461 YOLO-IR: Infrared Small Object Detection in High Noise Images
Authors: Yufeng Li, Yinan Ma, Jing Wu, Chengnian Long
Abstract:
Infrared object detection aims at separating small, dim targets from a cluttered background; its capabilities extend beyond the limits of visible light, making it invaluable in a wide range of applications, such as improving safety, security, efficiency, and functionality. However, existing methods are usually sensitive to noise in the input infrared image, leading to a decrease in target detection accuracy and an increase in the false alarm rate in high-noise environments. To address this issue, an infrared small target detection algorithm called YOLO-IR is proposed in this paper to improve robustness to high infrared noise. To address the problem that high noise significantly reduces the clarity and reliability of target features in infrared images, we design a soft-threshold coordinate attention mechanism to improve the model’s ability to extract target features and its robustness to noise. Since the noise may overwhelm the local details of the target, resulting in the loss of small-target features during depth down-sampling, we propose a deep-and-shallow feature fusion neck to improve detection accuracy. In addition, because generalized Intersection over Union (IoU)-based loss functions may be sensitive to noise and lead to unstable training in high-noise environments, we introduce a Wasserstein-distance-based loss function to improve the training of the model. The experimental results show that YOLO-IR achieves a 5.0% improvement in recall and a 6.6% improvement in F1-score over existing state-of-the-art models.
Keywords: infrared small target detection, high noise, robustness, soft-threshold coordinate attention, feature fusion
Procedia PDF Downloads 74
4460 Plasma Collagen XVIII in Response to Intensive Aerobic Running and Aqueous Extraction of Black Crataegus Elbursensis in Male Rats
Authors: A. Abdi, A. Abbasi Daloee, A. Barari
Abstract:
Aim: The adaptations that occur in the human body after exercise training help healthy people stay away from certain diseases. One of the main adaptations is a change in blood circulation, especially in the vessels. The increase in capillary density depends on the balance between angiogenic and angiostatic factors. Most studies show that the exercise-induced changes in angiogenic factors indicate a low level of stimulators compared with inhibitors. It is believed that the plasma level of VEGF-A, an important angiogenic factor, is reduced after physical exercise. Findings indicate that the extract of the crataegus plant reduces platelet-derived growth factor receptor (PDGFR) autophosphorylation in human fibroblasts. More importantly, crataegus (1 to 100 mg per liter) clearly inhibits PDGFR autophosphorylation in vascular smooth muscle cells (VSMCs). Angiogenesis is a process that can be classified into physiological and pathophysiological forms. Collagen XVIII is part of the extracellular protein and heparan sulfate proteoglycans in vascular epithelial and endothelial basement membranes, and endostatin is released from the non-collagenous domain of collagen XVIII. Endostatin inhibits the growth of endothelial cells, inhibits angiogenesis, and weakens various types of cancer and the growth of tumors. The purpose of the current study was to investigate the effect of intensive aerobic running, with or without aqueous extract of black Crataegus elbursensis, on collagen XVIII in male rats. Design: Thirty-two male Wistar rats (4-6 weeks old, 125-135 g) were acquired from the Pasteur Institute (Amol, Mazandaran) and randomly assigned to control (n = 16) and training (n = 16) groups. Rats were further divided into saline-control (SC) (n = 8), saline-training (ST) (n = 8), crataegus pentaegyna extraction-control (CPEC) (n = 8), and crataegus pentaegyna extraction-training (CPET) (n = 8) groups.
The control (SC and CPEC) groups remained sedentary, whereas the training groups underwent a high-intensity running exercise program. Plasma samples were collected and immediately frozen in liquid nitrogen. Statistical analysis was performed using a one-way analysis of variance and the Tukey test. Significance was accepted at P = 0.05. Results: The results show that the aerobic exercise group had the highest collagen XVIII concentration compared with the other groups, followed, respectively, by the black Crataegus, training-Crataegus, and control groups. Conclusion: In general, the researchers concluded that the increase in collagen XVIII (albeit insignificant) as a result of physical activity and consumption of black Crataegus extract could possibly serve as a regional inhibitor of angiogenesis and further evidence for the anti-cancer effects of physical activity. Since plasma endostatin could not be measured in this study, it is suggested that both indices be measured together with important angiogenic factors to allow a more accurate interpretation of the changes in angiogenic and angiostatic factors resulting from physical exercise.
Keywords: aerobic running, Crataegus elbursensis, Collagen XVIII
Procedia PDF Downloads 325
4459 Determination of a Novel Artificial Sweetener Advantame in Food by Liquid Chromatography Tandem Mass Spectrometry
Authors: Fangyan Li, Lin Min Lee, Hui Zhu Peh, Shoet Harn Chan
Abstract:
Advantame, a derivative of aspartame, is the latest addition to a family of low-caloric, high-potency dipeptide sweeteners that includes aspartame, neotame, and alitame. The use of advantame as a high-intensity sweetener in food was first accepted by Food Standards Australia New Zealand in 2011 and subsequently by US and EU food authorities in 2014, with results from toxicity and exposure studies showing that advantame poses no safety concern to the public at regulated levels. To our knowledge, there is currently barely any detailed information on analytical methods for advantame in food matrices, except for one report published in Japanese describing a high-performance liquid chromatography (HPLC) and liquid chromatography/mass spectrometry (LC-MS) method with a detection limit at the ppm level. However, the use of acid in sample preparation and instrumental analysis in that report raises doubt over the reliability of the method, as there are indications that the stability of advantame is compromised under acidic conditions. Besides, the method may not be suitable for analyzing food matrices containing advantame at low-ppm or sub-ppm levels. In this presentation, a simple, specific, and sensitive method for the determination of advantame in food is described. The method involves extraction with water and clean-up via solid phase extraction (SPE), followed by detection using liquid chromatography tandem mass spectrometry (LC-MS/MS) in negative electrospray ionization mode. No acid was used in the entire procedure. Single-laboratory validation of the method was performed in terms of linearity, precision, and accuracy. A low detection limit at the ppb level was achieved. Satisfactory recoveries were obtained using spiked samples at three different concentration levels. This validated method could be used in routine inspection of advantame levels in food.
Keywords: advantame, food, LC-MS/MS, sweetener
Procedia PDF Downloads 476
4458 Feasibility of Washing/Extraction Treatment for the Remediation of Deep-Sea Mining Tailings
Authors: Kyoungrean Kim
Abstract:
The importance of deep-sea mineral resources is increasing dramatically as land-based mineral resources are depleted by growing economic activity. Korea has acquired exclusive exploration licenses in four areas: the Clarion-Clipperton Fracture Zone in the Pacific Ocean (2002), Tonga (2008), Fiji (2011), and the Indian Ocean (2014). Nautilus Minerals (Canada) and Lockheed Martin (USA) are expected to be prepared for commercial mining by 2020. The London Protocol 1996 (LP) under the International Maritime Organization (IMO) and the International Seabed Authority (ISA) will set environmental guidelines for deep-sea mining by 2020 to protect the marine environment. In this research, the applicability of washing/extraction treatment for the remediation of deep-sea mining tailings was evaluated in order to provide preliminary data for developing a practical remediation technology in the near future. Polymetallic nodule samples were collected at the Clarion-Clipperton Fracture Zone in the Pacific Ocean and stored at room temperature. Samples were pulverized using a jaw crusher and a ball mill and then classified into three particle sizes (> 63 µm, 63-20 µm, < 20 µm) using vibratory sieve shakers (Analysette 3 Pro, Fritsch, Germany) with 63 µm and 20 µm sieves. Only the 63-20 µm fraction was used for the investigation, considering the lower limit of the ore dressing process, which is tens of µm to 100 µm. Rhamnolipid and sodium alginate, as biosurfactants, and aluminum sulfate, which is mainly used as a flocculant, were chosen as environmentally friendly additives. Samples were adjusted to a 2% suspension with deionized water and then mixed with various concentrations of additives. The mixture was stirred with a magnetic bar for specific reaction times, and the liquid phase was then separated in a centrifugal separator (Thermo Fisher Scientific, USA) at 4,000 rpm for 1 h. The separated liquid was filtered with a syringe and an acrylic-based filter (0.45 µm).
The extracted heavy metals in the filtered liquid were then determined using a UV-Vis spectrometer (DR-5000, Hach, USA) and a heat block (DBR 200, Hach, USA), following US EPA methods (8506, 8009, 10217 and 10220). The polymetallic nodules were mainly composed of manganese (27%), iron (8%), nickel (1.4%), copper (1.3%), cobalt (1.3%) and molybdenum (0.04%). Based on the remediation standards of various countries, nickel (Ni), copper (Cu), cadmium (Cd) and zinc (Zn) were selected as primary target materials. Throughout this research, the use of rhamnolipid was shown to be an effective approach for removing heavy metals from samples originating from manganese nodules. Sodium alginate might also be an effective additive for the remediation of deep-sea mining tailings such as polymetallic nodules. Compared with rhamnolipid and sodium alginate, aluminum sulfate was a more effective additive at short reaction times, within 4 h. Based on these results, sequential particle separation, selective extraction/washing, advanced filtration of the liquid phase, water treatment without dewatering, and solidification/stabilization may be considered candidate technologies for the remediation of deep-sea mining tailings.
Keywords: deep-sea mining tailings, heavy metals, remediation, extraction, additives
Procedia PDF Downloads 155
4457 Kocuria Keratitis: A Rare and Diagnostically Challenging Infection of the Cornea
Authors: Sarah Jacqueline Saram, Diya Baker, Jaishree Gandhewar
Abstract:
Named after the Slovak microbiologist Miroslav Kocur, the Kocuria spp. are an emerging cause of significant human infections. Their predilection for immunocompromised states, such as malignancy and metabolic disorders, is highlighted in the literature. These coagulase-negative, Gram-positive cocci are commensals found in the skin and oropharynx of humans, and their growing presence as causative organisms in ocular infections cannot be ignored. The severe, rapid, and unrelenting disease course associated with Kocuria keratitis is underlined in the literature. However, the clinical features are variable, which may impede diagnosis. Here, we describe a first account of an initial misdiagnosis caused by reliance on subjective analysis of features on confocal microscopy, which ultimately led to a delay in commencing the correct treatment. In documenting this, we hope to underline to clinicians the difficulty of recognising a Kocuria rhizophila keratitis, given its clinical presentation is similar to that of an Acanthamoeba keratitis, and thus emphasize the need for early investigations such as corneal scrapes to secure the correct diagnosis and prevent further harm and vision loss for the patient.
Keywords: keratitis, cornea, infection, rare, Kocuria
Procedia PDF Downloads 54
4456 An Application to Predict the Best Study Path for Information Technology Students in Learning Institutes
Authors: L. S. Chathurika
Abstract:
Early prediction of student performance is an important factor in achieving academic excellence. Whatever the study stream in secondary education, students in Sri Lanka lay the foundation for higher studies during the first year of their degree or diploma program. The information technology (IT) field offers specialization areas in which students can demonstrate their talents and skills, such as software engineering, network administration, database administration, and multimedia design. After completing the first year, students attempt to select the best path by considering numerous factors. The purpose of this experiment is to predict the best study path using machine learning algorithms. Five classification algorithms were selected and tested: decision tree, support vector machine, artificial neural network, naïve Bayes, and logistic regression. The support vector machine obtained the highest accuracy, 82.4%. The most influential features were then identified to select the best study path.
Keywords: algorithm, classification, evaluation, features, testing, training
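The five-classifier comparison described in the abstract can be sketched with scikit-learn; the synthetic data and hyperparameters below are illustrative assumptions, not the study's actual student data or settings:

```python
# Hedged sketch: comparing the five classifier families named in the
# abstract on synthetic data. Dataset and settings are illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Stand-in for first-year student records: 10 features, 4 study paths
X, y = make_classification(n_samples=400, n_features=10, n_classes=4,
                           n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "SVM": SVC(kernel="rbf", random_state=0),
    "ANN": MLPClassifier(max_iter=2000, random_state=0),
    "naive Bayes": GaussianNB(),
    "logistic regression": LogisticRegression(max_iter=1000),
}
scores = {name: accuracy_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
best = max(scores, key=scores.get)   # the study reports SVM as its winner
```

On real student data the ranking depends entirely on the features collected; the abstract's 82.4% figure applies to its own data set, not this sketch.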
Procedia PDF Downloads 119
4455 Critical Factors for Successful Adoption of Land Value Capture Mechanisms – An Exploratory Study Applied to Indian Metro Rail Context
Authors: Anjula Negi, Sanjay Gupta
Abstract:
The paradigms studied reveal inadequate financial resources, whether to finance metro rail construction, to meet operational revenues, or to derive profits in the long term. Sustainable funding remains elusive for much-needed public transport modes, such as urban rail or metro rail, to be operated successfully. India is embarking on a sustainable transport journey and has proposed metro rail systems countrywide. As an emerging economic leader, its fiscal constraints are paramount, and the land value capture (LVC) mechanism provides necessary support and innovation toward development. India's metro rail policy promotes multiple methods of financing, including private-sector investment and public-private partnership. The critical question that remains to be addressed is what factors can make such mechanisms work. Globally, urban rail is noted by many researchers as a revolution in future mobility. In this study, the researchers deep-dive, by way of literature review and empirical assessment, into the factors that can lead to the adoption of LVC mechanisms. It is understood that the adoption of LVC methods is at a nascent stage in India. Research posits numerous challenges faced by metro rail agencies in raising funding and in capturing incremental value. Issues pertaining to land-based financing include, inter alia: long-term financing, inter-institutional coordination, economic/market suitability, dedicated metro funds, land ownership issues, a piecemeal approach to real estate development, and property development legal frameworks. The question under investigation is which parameters can lead to success in the adoption of land value capture (LVC) as a financing mechanism. This research provides insights into key parameters crucial to the adoption of LVC in the context of Indian metro rails. The researchers have studied the current forms of LVC mechanisms at various metro rails across the country.
This study is significant because little research is available on the adoption of LVC applicable to the Indian context. Transit agencies, state governments, urban local bodies, policy makers and think tanks, academia, developers, funders, researchers, and multilateral agencies may benefit from this research in taking LVC mechanisms forward in practice. The study deems it imperative to explore and understand the key parameters that impact the adoption of LVC. An extensive literature review and ratification by experts working in the metro rail arena were undertaken to arrive at the parameters for the study. Stakeholder consultations were undertaken for principal component extraction in the exploratory factor analysis (EFA) process. Forty-three seasoned and specialized experts, representing various types of stakeholders, participated in a semi-structured questionnaire to rate each parameter. Empirical data were collected on the eighteen chosen parameters, and significant correlations were extracted for descriptive and inferential statistics. The study findings reveal the principal components to be the institutional governance framework, spatial planning features, legal frameworks, funding sustainability features, and fiscal policy measures. In particular, the funding sustainability features highlight the sub-variables of beneficiaries paying and the use of multiple revenue options as drivers of success in LVC adoption. The researchers recommend incorporating these variables at an early stage of design and project structuring for successful adoption of LVC, in turn improving the revenue sustainability of a public transport asset and supporting informed transport policy decisions.
Keywords: exploratory factor analysis, land value capture mechanism, financing metro rails, revenue sustainability, transport policy
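The principal-component-extraction step of an EFA can be sketched as an eigendecomposition of the correlation matrix of the questionnaire responses. The random responses and the Kaiser retention criterion below are assumptions for illustration; only the 43-respondent, 18-parameter shape comes from the abstract:

```python
# Hedged sketch of the principal-component-extraction step of an EFA:
# eigendecompose the correlation matrix of Likert-style responses and
# retain components with eigenvalue > 1 (Kaiser criterion, assumed here).
# The data are random stand-ins, not the study's 43 expert responses.
import numpy as np

rng = np.random.default_rng(0)
n_respondents, n_params = 43, 18            # shapes taken from the abstract
responses = rng.integers(1, 6, size=(n_respondents, n_params)).astype(float)

corr = np.corrcoef(responses, rowvar=False)  # 18 x 18 correlation matrix
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]            # sort components descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = int(np.sum(eigvals > 1.0))       # Kaiser criterion
# Loadings: how strongly each of the 18 parameters maps onto each component
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
```

In a full EFA the retained components would then be rotated (e.g. varimax) and interpreted, which is how the study arrives at named components such as "funding sustainability features".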
Procedia PDF Downloads 82
4454 Impact of Map Generalization in Spatial Analysis
Authors: Lin Li, P. G. R. N. I. Pussella
Abstract:
When representing spatial data and their attributes on different types of maps, the scale plays a key role in the process of map generalization. The process consists of two main operators: selection and omission. Once data are selected, they undergo several geometric transformation processes such as elimination, simplification, smoothing, exaggeration, displacement, aggregation, and size reduction. As a result of these operations at different levels of data, the geometry of spatial features, such as length, sinuosity, orientation, perimeter, and area, is altered. This is most pronounced when preparing small-scale maps, since the cartographer does not have enough space to represent all the features on the map. In practice, when GIS users want to analyze a set of spatial data, they retrieve a data set and perform the analysis without considering very important characteristics such as the scale, the purpose of the map, and the degree of generalization. Further, GIS users use and compare different maps with different degrees of generalization. Sometimes, GIS users go beyond the scale of the source map using the zoom-in facility, violating the basic cartographic rule that it is not suitable to create a larger-scale map from a smaller-scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps at different scales (1:10000, 1:50000, and 1:250000), prepared by the Survey Department of Sri Lanka, the national mapping agency of Sri Lanka, were used. Features common to all three maps were used, and an overlay analysis was repeated with different combinations of the data. Road, river, and land use data sets were used for the study. A simple model to find the best place for a wildlife park was used to identify the effects.
The results show remarkable effects across the different degrees of generalization: different locations with different geometries were obtained as the outputs of the analysis. The study suggests that reasonable methods are needed to overcome this effect. As a solution, it is recommended to bring all the data sets to a common scale before performing the analysis.
Keywords: generalization, GIS, scales, spatial analysis
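One of the generalization operators mentioned above, line simplification, can be sketched with the classic Ramer-Douglas-Peucker algorithm; the toy polyline and tolerance below are illustrative, and real map generalization would derive the tolerance from the target scale:

```python
# Hedged sketch: Ramer-Douglas-Peucker line simplification, one of the
# generalization operators discussed above. Tolerance is an assumption;
# in practice it would be driven by the target map scale.
import math

def perp_dist(p, a, b):
    """Perpendicular distance of point p from the chord a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg_len = math.hypot(dx, dy)
    if seg_len == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dx * (ay - py) - dy * (ax - px)) / seg_len

def simplify(points, tol):
    """Recursively drop vertices closer than tol to the chord."""
    if len(points) < 3:
        return list(points)
    dists = [perp_dist(p, points[0], points[-1]) for p in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] > tol:
        left = simplify(points[:i + 1], tol)
        right = simplify(points[i:], tol)
        return left[:-1] + right          # avoid duplicating the split point
    return [points[0], points[-1]]

# A toy "river" polyline: stronger generalization keeps fewer vertices,
# which is exactly what alters length and sinuosity in the overlay analysis.
river = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7)]
coarse = simplify(river, tol=1.0)
```

This is why the same river measured on a 1:250000 map is shorter and straighter than on a 1:10000 map: the simplification tolerance grows with the scale denominator.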
Procedia PDF Downloads 328
4453 Detection of Cardiac Arrhythmia Using Principal Component Analysis and XGBoost Model
Authors: Sujay Kotwale, Ramasubba Reddy M.
Abstract:
Electrocardiogram (ECG) is a non-invasive technique used to study and analyze various heart diseases. Cardiac arrhythmia is a serious heart disease which, when left untreated, leads to the death of the patient. Early detection of cardiac arrhythmia would help doctors provide proper treatment of the heart. In the past, various algorithms and machine learning (ML) models were used for early detection of cardiac arrhythmia, but few of them achieved good results. In order to improve the performance, this paper implements principal component analysis (PCA) along with an XGBoost model. PCA was applied to the raw ECG signals to suppress redundant information and extract significant features. The obtained significant ECG features were fed into the XGBoost model, and the performance of the model was evaluated. In order to validate the proposed technique, raw ECG signals obtained from the standard MIT-BIH database were employed for the analysis. The results show that the performance of the proposed method is superior to several state-of-the-art techniques.
Keywords: cardiac arrhythmia, electrocardiogram, principal component analysis, XGBoost
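The PCA-then-boosting pipeline described above can be sketched as follows. Note two assumptions: scikit-learn's GradientBoostingClassifier stands in for XGBoost to keep the sketch dependency-light, and the synthetic "beats" are not real MIT-BIH ECG data:

```python
# Hedged sketch of the paper's pipeline: PCA to compress each beat,
# then a gradient-boosted tree classifier. GradientBoostingClassifier
# is a stand-in for XGBoost; the data are synthetic, not MIT-BIH.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.pipeline import make_pipeline

# Stand-in "beats": 180 samples per segment, 3 rhythm classes
X, y = make_classification(n_samples=500, n_features=180, n_informative=20,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

clf = make_pipeline(
    PCA(n_components=0.95),              # keep 95% of the signal variance
    GradientBoostingClassifier(random_state=0),
)
clf.fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
n_kept = clf.named_steps["pca"].n_components_   # components actually retained
```

Passing a float to `n_components` makes PCA choose the smallest number of components explaining that fraction of variance, which is the "suppress redundant information" step the abstract describes.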
Procedia PDF Downloads 119
4452 A Predictive Machine Learning Model of the Survival of Female-Led and Co-Led Small and Medium Enterprises in the UK
Authors: Mais Khader, Xingjie Wei
Abstract:
This research sheds light on female entrepreneurs by providing new insights into the survival prediction of companies led by females in the UK. This study aims to build a predictive machine learning model of the survival of female-led and co-led small and medium enterprises (SMEs) in the UK over the period 2000-2020. The predictive model utilised a combination of financial and non-financial features related to both the companies and their directors to predict SMEs' survival. These features were studied in terms of their contribution to the resultant predictive model. Five machine learning models were used in the modelling: decision tree, AdaBoost, naïve Bayes, logistic regression, and SVM. The AdaBoost model had the highest performance of the five, with an accuracy of 73% and an AUC of 80%. The results show that company size, management experience, financial performance, industry, region, and the percentage of females in management have high feature importance in predicting companies' survival.
Keywords: company survival, entrepreneurship, females, machine learning, SMEs
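The best-performing model reported above (AdaBoost, evaluated by accuracy and AUC, with feature importances inspected) can be sketched with scikit-learn; the synthetic binary "survived" labels and features are stand-ins for the study's SME data:

```python
# Hedged sketch: AdaBoost for binary survival prediction, scored by
# accuracy and AUC as in the abstract, with feature importances read off
# afterwards. Data are synthetic stand-ins, not UK SME records.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

# 12 illustrative company/director features, label = survived (1) or not (0)
X, y = make_classification(n_samples=600, n_features=12, n_informative=5,
                           weights=[0.4, 0.6], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

model = AdaBoostClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
acc = accuracy_score(y_te, model.predict(X_te))
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
importances = model.feature_importances_   # one normalized weight per feature
```

Ranking `importances` is how statements like "company size and management experience matter most" are derived from a fitted boosted ensemble.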
Procedia PDF Downloads 101
4451 Metaphorical Perceptions of Middle School Students regarding Computer Games
Authors: Ismail Celik, Ismail Sahin, Fetah Eren
Abstract:
The computer, among the most important inventions of the twentieth century, has become an increasingly important component of our everyday lives. Computer games have also become increasingly popular day by day, owing to their realistic virtual environments, their audio and visual features, and the roles they offer players. In the present study, the metaphors students have for computer games are investigated, in an effort to fill a gap in the literature. To determine middle school students' metaphorical images of the concept 'computer game', students were asked to complete the sentence 'A computer game is like/similar to … because …'. The metaphors created by the students were grouped into six categories based on the source of the metaphor. These categories were ordered, according to the number of metaphors they included, as 'computer game as a means of entertainment', 'computer game as a beneficial means', 'computer game as a basic need', 'computer game as a source of evil', 'computer game as a means of withdrawal', and 'computer game as a source of addiction'.
Keywords: computer game, metaphor, middle school students, virtual environments
Procedia PDF Downloads 535
4450 The Study of Flood Resilient House in Ebo-Town
Authors: Alagie Salieu Nankey
Abstract:
Flood-resistant housing is the key mechanism for withstanding flood hazards in Ebo Town. It has emerged as a simple yet powerful way of mitigating flooding in the community of Ebo Town. Even though there are different types of buildings, little is yet known about how and why floods affect buildings so severely. In this paper, we examine three different types of flood-resistant buildings that are suitable for Ebo Town. We gathered content and contextual features from six (6) respondents and used this data set to identify the factors that are significantly associated with flood-resistant houses. Moreover, we built a suitable design concept. We found that, amongst all the approaches examined in the literature study, the stilt (elevated) house is the most suitable building design for Ebo Town, and the pile foundation is the most appropriate foundation type for the study area. Amongst the contextual features, local materials are the most economical for the proposed design. This research proposes a framework that explains the theoretical relationships between flood hazard zones and flood-resistant houses in Ebo Town. Moreover, this research informs the design of sense-making and analytics tools for the flood-resistant house.
Keywords: flood-resistant, stilt house, flood hazard zone, pile foundation
Procedia PDF Downloads 46
4449 Enabling Wire Arc Additive Manufacturing in Aircraft Landing Gear Production and Its Benefits
Authors: Jun Wang, Chenglei Diao, Emanuele Pagone, Jialuo Ding, Stewart Williams
Abstract:
As a crucial component in aircraft, landing gear systems are responsible for supporting the plane during parking, taxiing, takeoff, and landing. Given the need for high load-bearing capacity over extended periods, 300M ultra-high strength steel (UHSS) is often the material of choice for crafting these systems due to its exceptional strength, toughness, and fatigue resistance. In the quest for cost-effective and sustainable manufacturing solutions, Wire Arc Additive Manufacturing (WAAM) emerges as a promising alternative for fabricating 300M UHSS landing gears. This is due to its advantages in near-net-shape forming of large components, cost-efficiency, and reduced lead times. Cranfield University has conducted an extensive preliminary study on WAAM 300M UHSS, covering feature deposition, interface analysis, and post-heat treatment. Both Gas Metal Arc (GMA) and Plasma Transferred Arc (PTA)-based WAAM methods were explored, revealing their feasibility for defect-free manufacturing. However, as-deposited 300M features showed lower strength but higher ductility compared to their forged counterparts. Subsequent post-heat treatments were effective in normalising the microstructure and mechanical properties, meeting qualification standards. A 300M UHSS landing gear demonstrator was successfully created using PTA-based WAAM, showcasing the method's precision and cost-effectiveness. The demonstrator, measuring Ø200 mm × 700 mm, was completed in 16 hours, using 7 kg of material at a deposition rate of 1.3 kg/hr. This resulted in a significant reduction in the Buy-to-Fly (BTF) ratio compared to traditional manufacturing methods, further validating WAAM's potential for this application. A "cradle-to-gate" environmental impact assessment, which considers the cumulative effects from raw material extraction to customer shipment, has revealed promising outcomes.
Utilising Wire Arc Additive Manufacturing (WAAM) for landing gear components significantly reduces the need for raw material extraction and refinement compared to traditional subtractive methods. This, in turn, lessens the burden on subsequent manufacturing processes, including heat treatment, machining, and transportation. Our estimates indicate that the carbon footprint of the component could be halved when switching from traditional machining to WAAM. Similar reductions are observed in embodied energy consumption and other environmental impact indicators, such as emissions to air, water, and land. Additionally, WAAM offers the unique advantage of part repair by redepositing only the necessary material, a capability not available through conventional methods. Our research shows that WAAM-based repairs can drastically reduce environmental impact, even when accounting for additional transportation for repairs. Consequently, WAAM emerges as a pivotal technology for reducing environmental impact in manufacturing, aiding the industry in its crucial and ambitious journey towards Net Zero. This study paves the way for transformative benefits across the aerospace industry, as we integrate manufacturing into a hybrid solution that offers substantial savings and access to more sustainable technologies for critical component production.
Keywords: WAAM, aircraft landing gear, microstructure, mechanical performance, life cycle assessment
Procedia PDF Downloads 159
4448 The Grammatical Dictionary Compiler: A System for Kartvelian Languages
Authors: Liana Lortkipanidze, Nino Amirezashvili, Nino Javashvili
Abstract:
The purpose of the grammatical dictionary is to provide information on the morphological and syntactic characteristics of the basic word in the dictionary entry. Electronic grammatical dictionaries are used as a tool for automated morphological analysis in text processing. The Georgian Grammatical Dictionary should contain grammatical information for each word: part of speech, type of declension/conjugation, grammatical forms of the word (paradigm), and alternative variants of the basic word/lemma. In this paper, we present a system for compiling the Georgian Grammatical Dictionary automatically. We propose dictionary-based methods for extending grammatical lexicons. The input lexicon contains only a small number of words with identical grammatical features. The extension is based on similarity measures between the features of words; more precisely, we add to the extended lexicon words that are similar to those already in the grammatical dictionary. Our dictionaries are corpus-based, and for the compilation we introduce a method for lemmatizing unknown words, i.e., words of which neither the full form nor the lemma is in the grammatical dictionary.
Keywords: acquisition of lexicon, Georgian grammatical dictionary, lemmatization rules, morphological processor
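The lemmatization of unknown words by similarity to existing dictionary entries can be sketched as a longest-suffix match against known inflectional endings; the toy suffix table below is invented for illustration and is not the paper's actual rule set or real Georgian morphology:

```python
# Hedged sketch: guess the lemma and paradigm of an unknown word by
# matching the longest inflectional suffix already attested in the
# grammatical dictionary. The suffix table is a toy illustration.
lexicon = {
    # surface-form suffix -> (lemma suffix, paradigm tag); invented entries
    "ebi": ("i", "NOUN-PL"),
    "ma":  ("", "NOUN-ERG"),
    "s":   ("", "NOUN-DAT"),
}

def guess_lemma(word):
    """Try known suffixes longest-first; fall back to the word itself."""
    for suffix in sorted(lexicon, key=len, reverse=True):
        if word.endswith(suffix) and len(word) > len(suffix):
            lemma_suffix, tag = lexicon[suffix]
            return word[: -len(suffix)] + lemma_suffix, tag
    return word, "UNKNOWN"

# Transliterated example: "wignebi" (plural surface form) -> lemma "wigni"
lemma, tag = guess_lemma("wignebi")
```

A real system would disambiguate between competing suffix analyses using corpus frequencies, which is where the corpus-based aspect of the dictionary comes in.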
Procedia PDF Downloads 148
4447 Exploration of in-situ Product Extraction to Increase Triterpenoid Production in Saccharomyces Cerevisiae
Authors: Mariam Dianat Sabet Gilani, Lars M. Blank, Birgitta E. Ebert
Abstract:
Plant-derived lupane-type pentacyclic triterpenoids are biologically active compounds that are highly interesting for applications in the medical, pharmaceutical, and cosmetic industries. Due to the low abundance of these valuable compounds in their natural sources and the environmentally harmful downstream process, alternative production methods, such as microbial cell factories, are being investigated. Engineered Saccharomyces cerevisiae strains harboring the heterologous genes for betulinic acid synthesis can produce up to 2 g L⁻¹ of triterpenoids, showing high potential for large-scale triterpenoid production. One limitation of the microbial synthesis is intracellular product accumulation. It not only makes cell disruption a necessary step in downstream processing but also limits productivity and product yield per cell. To overcome these restrictions, the aim of this study is to develop an in-situ extraction method that extracts the triterpenoids into a second, organic phase. Such continuous or sequential product removal from the biomass keeps the cells in an active state and enables extended production times or biomass recycling. After screening twelve different solvents, selected based on product solubility, biocompatibility, and environmental and health impact, isopropyl myristate (IPM) was chosen as a suitable solvent for in-situ product removal from S. cerevisiae. Impedance-based single-cell analysis and off-gas measurement of carbon dioxide emission showed that cell viability and physiology were not affected by the presence of IPM. Initial experiments demonstrated that after the addition of 20 vol% IPM to cultures in the stationary phase, 40% of the total produced triterpenoids were extracted from the cells into the organic phase.
In future experiments, the application of IPM in a repeated batch process will be tested, where IPM is added at the end of each batch run to remove triterpenoids from the cells, allowing the same biocatalysts to be used in several sequential batch steps. Due to its high biocompatibility, the amount of IPM added to the culture can also be increased beyond 20 vol% to extract more than 40% of the triterpenoids into the organic phase, allowing the cells to produce more triterpenoids. This highlights the potential for developing a continuous large-scale process that allows biocatalysts to produce intracellular products continuously, without the need for cell disruption and without being limited by cell capacity.
Keywords: betulinic acid, biocompatible solvent, in-situ extraction, isopropyl myristate, process development, secondary metabolites, triterpenoids, yeast
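As a hedged back-of-envelope sketch: if each repeated-batch round removed a fixed fraction f of the triterpenoids present at extraction time (f ≈ 0.4, per the single-batch result above), the cumulative recovery after n rounds would be 1 − (1 − f)ⁿ. This assumes equal extraction efficiency every round, an idealisation of the proposed process that ignores continued production between rounds:

```python
# Hedged estimate (assumption, not a result from the study): cumulative
# recovery if each batch round removes a constant fraction f of the
# triterpenoids present when the solvent is added.
def cumulative_recovery(f, n):
    """Fraction recovered after n rounds at per-round efficiency f."""
    return 1.0 - (1.0 - f) ** n

recov_3 = cumulative_recovery(0.4, 3)   # about 78% after three rounds
```

Under this idealisation, three sequential extractions at the observed 40% per-round efficiency would already recover most of a fixed intracellular pool, which is the intuition behind the repeated batch proposal.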
Procedia PDF Downloads 153