Search results for: extraction techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8106

6246 Synthesis and Characterisation of Bio-Based Acetals Derived from Eucalyptus Oil

Authors: Kirstin Burger, Paul Watts, Nicole Vorster

Abstract:

Green chemistry focuses on synthesis with a low negative impact on the environment. This research focuses on synthesizing novel compounds from all-natural Eucalyptus citriodora oil. Eight novel plasticizer compounds are synthesized and optimized using flow chemistry technology. A precursor to one novel compound can be synthesized from the lauric acid present in coconut oil. Key parameters, such as catalyst screening and loading, reaction time, temperature, and residence time under flow chemistry conditions, are investigated. The compounds are characterised using GC-MS, FT-IR, 1H and 13C-NMR, and X-ray crystallography. The efficiency of the compounds is compared to two commercial plasticizers, i.e., dibutyl phthalate and Eastman 168. Several PVC-plasticized film formulations are produced using the bio-based novel compounds. Tensile strength, stress at fracture, and percentage elongation are tested. The effect of increasing the plasticizer percentage in the film formulations is investigated over 3, 6, 9, and 12%. The diastereoisomers of each compound are separated and formulated into PVC films, and differences in tensile strength are measured. Leaching, flexibility, and changes in glass transition temperature of the PVC-plasticized films are recorded. The research objective includes using these novel compounds as green bio-plasticizer alternatives in plastic products for infants. The inhibitory effect of the compounds on six pathogens affecting infants is studied, namely Escherichia coli, Staphylococcus aureus, Shigella sonnei, Pseudomonas putida, Salmonella choleraesuis, and Klebsiella oxytoca.

Keywords: bio-based compounds, plasticizer, tensile strength, microbiological inhibition, synthesis

Procedia PDF Downloads 163
6245 Seafloor and Sea Surface Modelling in the East Coast Region of North America

Authors: Magdalena Idzikowska, Katarzyna Pająk, Kamil Kowalczyk

Abstract:

Seafloor topography is a fundamental issue in geological, geophysical, and oceanographic studies. Single-beam or multibeam sonars attached to the hulls of ships emit a hydroacoustic signal from transducers and reproduce the topography of the seabed. This solution provides relevant accuracy and spatial resolution. Bathymetric data from ship surveys are provided by the National Centers for Environmental Information of the National Oceanic and Atmospheric Administration. Unfortunately, most of the seabed is still unidentified, as there are many gaps still to be explored between ship survey tracks. Moreover, such measurements are very expensive and time-consuming. One solution is the raster bathymetric models shared by the General Bathymetric Chart of the Oceans. The offered products are a compilation of different sets of data, raw or processed. Measurements of gravity anomalies also serve as indirect data for the development of bathymetric models. Some forms of seafloor relief (e.g., seamounts) increase the force of the Earth's pull, leading to changes in the sea surface. Based on satellite altimetry data, sea surface height and marine gravity anomalies can be estimated, and from the anomalies it is possible to infer the structure of the seabed. The main goal of the work is to create regional bathymetric models and models of the sea surface in the area of the east coast of North America, a region of seamounts and undulating seafloor. The research includes an analysis of the methods and techniques used, an evaluation of the interpolation algorithms employed, model densification, and the creation of grid models. The input data are raster bathymetric models in NetCDF format, survey data from multibeam soundings in MB-System format, and satellite altimetry data from the Copernicus Marine Environment Monitoring Service. The methodology includes data extraction, processing, mapping, and spatial analysis.
Visualization of the obtained results was carried out with Geographic Information System tools. The result is an extension of the state of knowledge on the quality and usefulness of the data used for seabed and sea surface modeling, and on the accuracy of the generated models. Sea level is averaged over time and space (excluding waves, tides, etc.). Its changes, together with knowledge of the topography of the ocean floor, inform us indirectly about the volume of the entire ocean. The true shape of the ocean surface is further varied by phenomena such as tides, differences in atmospheric pressure, wind systems, thermal expansion of water, and phases of ocean circulation. Depending on the location of the point, the greater the depth, the lower the trend of sea level change. The study shows that combining data sets from different sources, with different accuracies, can affect the quality of sea surface and seafloor topography models.
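Gap-filling between survey tracks amounts to interpolating scattered soundings onto a regular grid. As a minimal illustration of the kind of interpolation algorithm the study evaluates, the sketch below grids a few hypothetical soundings with inverse distance weighting; the coordinates and depths are invented, and the actual algorithms compared in the work are not specified here.

```python
import math

def idw_depth(x, y, soundings, power=2.0, eps=1e-9):
    """Inverse-distance-weighted depth estimate at (x, y).

    soundings: list of (x, y, depth) tuples, e.g. from multibeam tracks.
    """
    num = den = 0.0
    for sx, sy, depth in soundings:
        d = math.hypot(x - sx, y - sy)
        if d < eps:                      # exactly on a sounding: return it
            return depth
        w = 1.0 / d ** power
        num += w * depth
        den += w
    return num / den

# Grid a 3x3 patch from four hypothetical soundings (depths in metres).
pts = [(0, 0, -120.0), (1, 0, -150.0), (0, 1, -110.0), (1, 1, -140.0)]
grid = [[idw_depth(i / 2, j / 2, pts) for i in range(3)] for j in range(3)]
```

Real gridding would weight soundings by their source accuracy, which is exactly the data-combination issue the abstract raises.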

Keywords: seafloor, sea surface height, bathymetry, satellite altimetry

Procedia PDF Downloads 59
6244 Suitability Verification of Cellulose Nanowhisker as a Scaffold for Bone Tissue Engineering

Authors: Moon Hee Jung, Dae Seung Kim, Sang-Myung Jung, Gwang Heum Yoon, Hoo Cheol Lee, Hwa Sung Shin

Abstract:

Scaffolds are an important component to support the growth and differentiation of osteoblasts for the regeneration of injured bone in bone tissue engineering. We utilized tunicate cellulose nanowhisker (CNW) as a scaffold and developed a complex system that can enhance the differentiation of osteoblasts by applying mechanical stimulation. CNW, a crystalline form of cellulose, has high stiffness with a large surface area and is useful as a biomedical material due to its biodegradability and biocompatibility. In this study, CNW was obtained by extraction from tunicate and was confirmed to support the adhesion, differentiation, and growth of osteoblasts without cytotoxicity. In addition, osteoblasts were successfully differentiated under mechanical stimulation, followed by calcium-dependent signaling. In conclusion, we verified the suitability of CNW as a scaffold and its potential as a bone substitute.

Keywords: osteoblast, cellulose nanowhisker, CNW, mechanical stimulation, bone tissue engineering, bone substitute

Procedia PDF Downloads 347
6243 Production of Biodiesel from Melon Seed Oil Using Sodium Hydroxide as a Catalyst

Authors: Ene Rosemary Ndidiamaka, Nwangwu Florence Chinyere

Abstract:

The physicochemical properties of melon seed oil were studied to determine its potential as a viable feedstock for biodiesel production. The melon seed oil was extracted by solvent extraction using n-hexane as the extracting solvent. In this research, methanol was the alcohol used in the production of biodiesel, although alcohols such as ethanol or propanol may also be used. Sodium hydroxide was employed as the catalyst. The melon seed oil was characterized for specific gravity, pH, ash content, iodine value, acid value, saponification value, peroxide value, free fatty acid value, flash point, viscosity, and refractive index using standard methods. The melon seed had a very high oil content. The specific gravity and flash point of the oil are satisfactory. However, the moisture content of the oil exceeded the stipulated ASTM standard for biodiesel production. The overall results indicate that melon seed oil is suitable for a single-stage transesterification process for biodiesel production.

Keywords: biodiesel, catalyst, melon seed, transesterification

Procedia PDF Downloads 343
6242 Python Implementation for S1000D Applicability Depended Processing Model - SALERNO

Authors: Theresia El Khoury, Georges Badr, Amir Hajjam El Hassani, Stéphane N’Guyen Van Ky

Abstract:

The widespread adoption of machine learning and artificial intelligence across different domains can be attributed to the digitization of data over several decades, resulting in vast amounts of data of varying types and structures. Thus, data processing and preparation turn out to be a crucial stage. However, applying these techniques to S1000D standard-based data poses a challenge due to its complexity and the need to preserve logical information. This paper describes SALERNO, an S1000D AppLicability dEpended pRocessiNg mOdel. This Python-based model analyzes and converts XML S1000D-based files into an easier data format that can be used in machine learning techniques while preserving the different logic and relationships in the files. The model parses the files in a given folder, filters them, and extracts the required information to be saved in appropriate data frames and Excel sheets. Its main idea is to group the extracted information by applicability. In addition, it extracts the full text by replacing internal and external references while maintaining the relationships between files, as well as the necessary requirements. The resulting files can then be saved in databases and used in different models. Documents in both English and French were tested, and special characters were decoded. Updates to the technical manuals were taken into consideration as well. The model was tested on different versions of S1000D, and the results demonstrated its ability to effectively handle applicability, requirements, references, and relationships across all files and at different levels.
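The grouping-by-applicability idea can be sketched in a few lines of Python. The XML fragment and the `applicRefId` attribute below are simplified stand-ins for real S1000D applicability markup, which is considerably richer:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical, heavily simplified S1000D-like fragment; real data modules
# carry far richer applicability annotations than a single attribute.
SAMPLE = """
<dmodule>
  <content>
    <para applicRefId="modelA">Torque the bolt to 25 Nm.</para>
    <para applicRefId="modelB">Torque the bolt to 30 Nm.</para>
    <para applicRefId="modelA">Check the seal.</para>
  </content>
</dmodule>
"""

def group_by_applicability(xml_text):
    """Extract paragraph text and group it by its applicability reference."""
    root = ET.fromstring(xml_text)
    groups = defaultdict(list)
    for para in root.iter("para"):
        groups[para.get("applicRefId", "all")].append(para.text)
    return dict(groups)

groups = group_by_applicability(SAMPLE)
```

The resulting dictionary maps each applicability reference to its text fragments, which could then be written to data frames or Excel sheets as the abstract describes.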

Keywords: aeronautics, big data, data processing, machine learning, S1000D

Procedia PDF Downloads 105
6241 Measurement of Coal Fineness, Air Fuel Ratio, and Fuel Weight Distribution in a Vertical Spindle Mill’s Pulverized Fuel Pipes at Classifier Vane 40%

Authors: Jayasiler Kunasagaram

Abstract:

In power generation, coal fineness is crucial to maintain flame stability, ensure combustion efficiency, and lower emissions to the environment. In order for the pulverized coal to react effectively in the boiler furnace, at least 70% of the coal particles need to be finer than 74 μm. This paper presents the experimental results for coal fineness, air-fuel ratio, and fuel weight distribution in pulverized fuel pipes at a classifier vane setting of 40%. The aim of this experiment is to extract the pulverized coal isokinetically and investigate the data accordingly. Dirty air velocity, coal sample extraction, and coal sieving experiments were performed to measure coal fineness. The experimental results show that the required coal fineness can be achieved at the 40% classifier vane setting. However, it does not surpass the desired value by a great margin.
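The fineness figure itself comes from a sieve analysis of the extracted sample. A minimal sketch of that calculation, with invented retained masses rather than measured data:

```python
# Sieve analysis sketch: fraction of pulverized coal passing a 74 um sieve.
# Masses (g) retained on each sieve are illustrative, not measured data.
retained = {300: 2.0, 150: 8.0, 74: 18.0, 0: 72.0}  # 0 = pan (finer than 74 um)

total = sum(retained.values())
fineness = 100.0 * retained[0] / total  # percent finer than 74 um

meets_spec = fineness >= 70.0  # target: at least 70% finer than 74 um
```

Here the illustrative sample just clears the 70% target, mirroring the abstract's finding that the requirement is met but not by a great margin.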

Keywords: coal power, emissions, isokinetic sampling, power generation

Procedia PDF Downloads 582
6240 Helicopter Exhaust Gases Cooler in Terms of Computational Fluid Dynamics (CFD) Analysis

Authors: Mateusz Paszko, Ksenia Siadkowska

Abstract:

Due to their low-altitude and relatively low-speed flight, helicopters are easy targets for modern combat assets, e.g., infrared-guided missiles. Current techniques aim to increase the combat effectiveness of military helicopters. Protection of the helicopter in flight from early detection, tracking, and finally destruction can be realized in many ways. One of them is cooling the hot exhaust gases emitted from the engines to the atmosphere in special heat exchangers. Nowadays, this process is realized in ejective coolers, where strong heat and momentum exchange takes place between the hot exhaust gases and cold air ejected from the atmosphere. The flow of air, of exhaust gases, and of the mixture of the two, and the heat transfer between cold air and hot exhaust gases, are governed by differential equations: flow continuity (mass transport), ejection of cold air by the expanding exhaust gases, conservation of momentum and energy, and the physical relationship equations. Calculating these processes in an ejective cooler by means of classical mathematical analysis is extremely hard or even impossible. Because of this, it is necessary to apply a numerical approach with modern numerical computer programs. The paper discusses the general usability of Computational Fluid Dynamics (CFD) in the process of designing an ejective exhaust gas cooler cooperating with a helicopter turbine engine. In this work, the CFD calculations have been performed for an ejective-based cooler cooperating with the PA W3 helicopter's engines.

Keywords: aviation, CFD analysis, ejective-cooler, helicopter techniques

Procedia PDF Downloads 306
6239 Catalytic and Non-Catalytic Pyrolysis of Walnut Shell Waste to Biofuel: Characterisation of Catalytic Biochar and Biooil

Authors: Saimatun Nisa

Abstract:

Walnut is an important export product of the Union Territory of Jammu and Kashmir. After extraction of the kernel, the walnut shell forms a solid waste that needs to be managed. Pyrolysis is one interesting option for the utilization of this walnut waste. In this study, a microwave pyrolysis reactor is used to convert the walnut shell biomass into its value-added products. Catalytic and non-catalytic conversion of walnut shell waste to oil, gas, and char was evaluated using a Co-based catalyst. The catalyst was characterized using XPS and SEM analysis. Pyrolysis temperature, reaction time, particle size, and sweeping gas (N₂) flow rate were set at 400–600 °C, 40 min, <0.6 mm to <4.75 mm, and 300 ml min⁻¹, respectively. The heating rate was fixed at 40 °C min⁻¹. The maximum gas yield, 45.2%, was obtained at 600 °C, 40 min, a particle size range of 1.18–2.36 mm, and 0.5 molar catalyst. The catalytic and non-catalytic liquid products were characterized by GC-MS analyses. In addition, the solid product was analyzed by means of FTIR and SEM.

Keywords: walnut shell, biooil, biochar, microwave pyrolysis

Procedia PDF Downloads 25
6238 A Convolutional Neural Network-Based Model for Lassa Fever Virus Prediction Using Patient Blood Smear Image

Authors: A. M. John-Otumu, M. M. Rahman, M. C. Onuoha, E. P. Ojonugwa

Abstract:

A Convolutional Neural Network (CNN) model for predicting Lassa fever was built using the Python 3.8.0 programming language, alongside the Keras 2.2.4 and TensorFlow 2.6.1 libraries as the development environment, in order to reduce the currently high risk of Lassa fever in West Africa, particularly in Nigeria. The study was prompted by some major flaws in the existing conventional laboratory equipment for diagnosing Lassa fever (RT-PCR), as well as flaws in the AI-based techniques that have been used for probing and prognosis of Lassa fever reported in the literature. A total of 15,679 blood smear microscopic image datasets were collected. The proposed model was trained on 70% of the dataset and tested on 30% of the microscopic images to avoid overfitting. A 3x3x3 convolution filter was used in the proposed system to extract features from the microscopic images. The proposed CNN-based model had a recall of 96%, a precision of 93%, an F1 score of 95%, and an accuracy of 94% in predicting and accurately classifying the images into clean or infected samples. Based on empirical evidence from the results and the literature consulted, the proposed model outperformed the other existing AI-based techniques evaluated. If properly deployed, the model will assist physicians, medical laboratory scientists, and patients in making accurate diagnoses of Lassa fever cases, allowing the mortality rate due to the Lassa fever virus to be reduced through sound decision-making.
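The feature-extraction step of a CNN reduces to sliding small kernels over the image. The pure-Python sketch below shows a single 3x3 convolution over one channel (for brevity, rather than the paper's 3x3x3 filter) on a toy patch, together with a 70/30 split of sample indices; the kernel and data are illustrative only.

```python
def conv2d_valid(image, kernel):
    """Slide a kernel over a 2D image (valid padding), as one CNN
    convolution layer does when extracting local features."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# A vertical-edge kernel applied to a toy 5x5 binary patch.
patch = [[0, 0, 1, 1, 1]] * 5
edge = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]
fmap = conv2d_valid(patch, edge)

# 70/30 split of sample indices, mirroring the paper's train/test protocol.
idx = list(range(10))
train, held_out = idx[:7], idx[7:]
```

In the actual model, many such learned kernels are stacked and followed by pooling and dense layers built with the Keras API.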

Keywords: artificial intelligence, ANN, blood smear, CNN, deep learning, Lassa fever

Procedia PDF Downloads 89
6237 Effect of Highway Construction on Soil Properties and Soil Organic Carbon (SOC) Along Lagos-Badagry Expressway, Lagos, Nigeria

Authors: Fatai Olakunle Ogundele

Abstract:

Road construction is increasingly common in today's world as human development expands and people increasingly rely on cars for daily transportation. The construction of a large network of roads has dramatically altered the landscape and impacted well-being in a number of deleterious ways. In addition, roads can shift population demographics and be a source of pollution in the environment. Road construction activities normally result in alteration of the soil's physical properties, through soil compaction on the road itself and on adjacent areas, as well as of its chemical and biological properties, among other effects. Understanding the roadside soil properties that are influenced by road construction activities can serve as a basis for formulating conservation-based management strategies. Therefore, this study examined the effects of road construction on soil properties and soil organic carbon along the Lagos-Badagry Expressway, Lagos, Nigeria. The study adopted purposive sampling techniques, and 40 soil samples were collected at a depth of 0–30 cm from each of the identified road intersections and infrastructures using a soil auger. The soil samples collected were taken to the laboratory for analysis of soil properties and carbon stock using standard methods. Both descriptive and inferential statistical techniques were applied to analyze the data obtained. The results revealed that soil compaction inhibits ecological succession on roadsides, in that increased compaction suppresses plant growth as well as causing changes in soil quality.

Keywords: highway, soil properties, organic carbon, road construction, land degradation

Procedia PDF Downloads 50
6236 A Flute Tracking System for Monitoring the Wear of Cutting Tools in Milling Operations

Authors: Hatim Laalej, Salvador Sumohano-Verdeja, Thomas McLeay

Abstract:

Monitoring of tool wear in milling operations is essential for achieving the desired dimensional accuracy and surface finish of a machined workpiece. Although there are numerous statistical models and artificial intelligence techniques available for monitoring the wear of cutting tools, these techniques cannot pinpoint which cutting edge of the tool, or which insert in the case of indexable tooling, is worn or broken. Currently, the task of monitoring the wear on the tool cutting edges is carried out by an operator who performs a manual inspection, causing undesirable stoppages of machine tools and, consequently, costs incurred from lost productivity. The present study is concerned with the development of a flute tracking system to segment signals related to each physical flute of a three-flute cutter used in an end milling operation. The purpose of the system is to monitor the cutting condition of each flute separately in order to determine their progressive wear rates and to predict imminent tool failure. The results of this study clearly show that the signals associated with each flute can be effectively segmented using the proposed flute tracking system. Furthermore, the results illustrate that by segmenting the sensor signal by flute it is possible to investigate the wear on each physical cutting edge of the cutting tool. These findings are significant in that they facilitate the online condition monitoring of a cutting tool for each specific flute without the need for operators or engineers to perform manual inspections of the tool.
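If the spindle speed is known and constant, per-flute segmentation reduces to assigning consecutive tooth-passing windows to the flutes in turn. A minimal sketch, assuming idealized synchronization between sample 0 and the first flute entering the cut (real systems would typically align against a spindle encoder or keyphasor signal):

```python
def segment_by_flute(signal, sample_rate, spindle_rpm, n_flutes=3):
    """Split a sensor signal into per-flute chunks.

    Each spindle revolution is divided evenly among the flutes, so
    consecutive tooth-passing windows are assigned to flutes 0..n-1
    in turn. Assumes a known, constant spindle speed.
    """
    samples_per_rev = sample_rate * 60.0 / spindle_rpm
    samples_per_flute = samples_per_rev / n_flutes
    flutes = [[] for _ in range(n_flutes)]
    for k, v in enumerate(signal):
        flute = int(k / samples_per_flute) % n_flutes
        flutes[flute].append(v)
    return flutes

# 1 s of a dummy signal at 6000 Hz, spindle at 1200 rpm -> 20 rev/s,
# 300 samples per revolution, 100 samples per tooth pass.
sig = list(range(6000))
per_flute = segment_by_flute(sig, 6000, 1200, 3)
```

Each per-flute chunk can then be fed to a separate wear indicator, which is the core of the tracking idea described above.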

Keywords: machining, milling operation, tool condition monitoring, tool wear prediction

Procedia PDF Downloads 283
6235 ACBM: Attention-Based CNN and Bi-LSTM Model for Continuous Identity Authentication

Authors: Rui Mao, Heming Ji, Xiaoyu Wang

Abstract:

Keystroke dynamics are widely used in identity recognition. They have the advantage that an individual's typing rhythm is difficult to imitate, and they support continuous authentication through the keyboard without extra devices. The existing keystroke dynamics authentication methods based on machine learning have drawbacks in supporting relatively complex scenarios with massive data, in both feature extraction and model optimization. To overcome these weaknesses, an authentication model for keystroke dynamics based on deep learning is proposed. The model uses feature vectors formed from keystroke content and keystroke timing. It ensures efficient continuous authentication by combining an attention mechanism with a CNN and Bi-LSTM. The model has been tested on the Open Data Buffalo dataset, and the results show an FRR of 3.09%, an FAR of 3.03%, and an EER of 4.23%. This demonstrates that the model is efficient and accurate for continuous authentication.
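The timing part of such feature vectors is conventionally built from hold and flight times. A minimal sketch of that construction; the exact vector layout ACBM feeds to the CNN and Bi-LSTM is an assumption here:

```python
def keystroke_features(events):
    """Build a feature sequence from (key, press_time, release_time) tuples.

    Features per key: hold time (release - press) and flight time
    (next press - this release). These are the common keystroke-dynamics
    timing features; the layout used by the ACBM model is assumed.
    """
    feats = []
    for i, (key, press, release) in enumerate(events):
        hold = release - press
        if i + 1 < len(events):
            flight = events[i + 1][1] - release
        else:
            flight = 0.0
        feats.append((key, hold, flight))
    return feats

# Timings in seconds for typing "cat" (illustrative values).
events = [("c", 0.00, 0.08), ("a", 0.15, 0.22), ("t", 0.30, 0.37)]
vec = keystroke_features(events)
```

The key identity supplies the "keystroke content" component, while hold and flight times supply the "keystroke time" component described above.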

Keywords: keystroke dynamics, identity authentication, deep learning, CNN, LSTM

Procedia PDF Downloads 131
6234 A Machine Learning Based Framework for Education Levelling in Multicultural Countries: UAE as a Case Study

Authors: Shatha Ghareeb, Rawaa Al-Jumeily, Thar Baker

Abstract:

In Abu Dhabi, there are many different education curriculums; the private schools and quality assurance sector supervises many private schools serving many nationalities. As there are many different education curriculums in Abu Dhabi to meet expats' needs, there are different requirements for registration and success, as well as different age groups for starting education in each curriculum. In fact, each curriculum has a different number of years, assessment techniques, reassessment rules, and exam boards. Currently, students who transfer between curriculums are not placed in the right year group, due to the different start and end dates of each academic year and the different date-of-birth ranges for each year group in each curriculum; as a result, we find students who are either younger or older than their year group, which creates gaps in their learning and performance. In addition, there is no way of storing student data throughout the academic journey so that schools can track the student learning process. In this paper, we propose to develop a computational framework applicable in multicultural countries such as the UAE, in which multiple education systems are implemented. The ultimate goal is to use cloud and fog computing technology, integrated with Artificial Intelligence techniques from Machine Learning, to aid a smooth transition when assigning students to their year groups and to provide levelling and differentiation information for students who relocate from one education curriculum to another, whilst also having the ability to store and access student data from anywhere throughout the academic journey.
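The core placement step can be sketched as a date-of-birth lookup against per-curriculum cutoff rules. The cutoff dates and entry ages below are invented for illustration and do not reflect actual Abu Dhabi regulations:

```python
from datetime import date

# Hypothetical cutoff rules per curriculum: age-reference date (month, day)
# and the age at which the first year group starts. These are assumptions.
CURRICULA = {
    "british":  {"cutoff": (9, 1), "entry_age": 5},   # age on 1 Sep, Year 1
    "american": {"cutoff": (9, 1), "entry_age": 6},   # age on 1 Sep, Grade 1
}

def year_group(dob, curriculum, school_year_start):
    """Assign a student to a year group from date of birth, so transfers
    between curricula land in the equivalent level for their age."""
    rules = CURRICULA[curriculum]
    m, d = rules["cutoff"]
    ref = date(school_year_start, m, d)
    age = ref.year - dob.year - ((ref.month, ref.day) < (dob.month, dob.day))
    return age - rules["entry_age"] + 1   # 1 = first year of that curriculum

g_uk = year_group(date(2015, 4, 20), "british", 2023)
g_us = year_group(date(2015, 4, 20), "american", 2023)
```

The same student maps to different year labels per curriculum, which is precisely the levelling gap the proposed framework, with its learned rules rather than a fixed table, aims to close.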

Keywords: admissions, algorithms, cloud computing, differentiation, fog computing, levelling, machine learning

Procedia PDF Downloads 118
6233 In Vitro Antifungal Activity of Essential Oil Artemisia Absinthium

Authors: Bouchenak Fatima, Lmegharbi Abdelbaki, Houssem Degaichia, Benrebiha Fatima

Abstract:

The essential oil composition of the leaves of Artemisia absinthium from the region of Cherchell (south of Algeria) was investigated by GC and GC-MS. Twenty-seven constituents were identified, corresponding to 84.63% of the total oil. The major components are thujone (60.82%), chamazulene (16.62%), p-cymene (4.29%), and 2-carene (4.25%). The antimicrobial activity of the oil was tested in vitro by two methods against three plant-pathogenic fungi (Botrytis cinerea, Fusarium culmorum, and Helminthosporium sp.). The activity was evaluated by the agar diffusion method and by the minimum inhibitory concentration (MIC). The oil exhibited interesting antimicrobial activity, and a preliminary study showed that it presented high toxicity against these fungi. These results, although preliminary, show good antifungal activity to limit and inhibit the development of these pathogenic agents.

Keywords: Artemisia absinthium, extraction process, chemical study, antifungal activity

Procedia PDF Downloads 454
6232 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows

Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid

Abstract:

Modeling dam-break flows over non-flat beds requires an accurate representation of the topography, which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing the topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography. In practice, experimental work in hydraulics can be very demanding in both time and cost, and computational hydraulics has served as an alternative to laboratory and field experiments. Unlike the forward problem, the inverse problem is used to identify the bed parameters from given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged so that the model parameters can be evaluated from measured data. However, this approach is not always possible, and it suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme. In the first stage, the forward problem is solved to determine the measurable parameters from known data. In the second stage, an adaptive-control Ensemble Kalman Filter is implemented to assimilate the observation data in order to obtain an accurate estimate of the topography. The main features of this method are, on the one hand, the ability to solve for different complex geometries with no need for any rearrangement of the original model to rewrite it in an explicit form, and, on the other hand, its strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry.
Numerical results are presented for a dam-break flow problem over a non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples, and locations of observations. The obtained results demonstrate the high reliability and accuracy of the proposed technique.
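The second-stage update can be illustrated with a scalar stochastic Ensemble Kalman Filter step, here on a toy linear forward model standing in for the shallow water solver:

```python
import random

random.seed(42)

def enkf_update(ensemble, predict, obs, obs_err):
    """One stochastic Ensemble Kalman Filter update of a scalar bed
    parameter from one free-surface observation.

    ensemble : list of candidate bed elevations
    predict  : forward model mapping bed elevation -> surface elevation
    """
    hs = [predict(b) for b in ensemble]
    n = len(ensemble)
    mb = sum(ensemble) / n
    mh = sum(hs) / n
    cov_bh = sum((b - mb) * (h - mh) for b, h in zip(ensemble, hs)) / (n - 1)
    var_h = sum((h - mh) ** 2 for h in hs) / (n - 1)
    gain = cov_bh / (var_h + obs_err ** 2)      # Kalman gain
    # shift each member toward a perturbed copy of the observation
    return [b + gain * (obs + random.gauss(0, obs_err) - h)
            for b, h in zip(ensemble, hs)]

# Toy forward model: surface elevation rises linearly with bed elevation.
predict = lambda bed: 2.0 + 0.8 * bed
true_bed = 1.5
obs = predict(true_bed)

ens = [random.gauss(0.0, 1.0) for _ in range(200)]   # prior ensemble
post = enkf_update(ens, predict, obs, obs_err=0.05)
```

The real method wraps this update in the adaptive-control loop, with the shallow water solver as `predict` and bed values at many nodes rather than a single scalar.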

Keywords: erodible beds, finite element method, finite volume method, nonlinear elasticity, shallow water equations, stresses in soil

Procedia PDF Downloads 113
6231 Surface Characterization of Zincblende and Wurtzite Semiconductors Using Nonlinear Optics

Authors: Hendradi Hardhienata, Tony Sumaryada, Sri Setyaningsih

Abstract:

Current progress in the field of nonlinear optics has enabled precise surface characterization of semiconductor materials. Nonlinear optical techniques are favorable due to their nondestructive measurement and their ability to work in non-vacuum and ambient conditions. The advance of bond hyperpolarizability models opens a wide range of nanoscale surface investigations, including the possibility to detect molecular orientation at the surface of silicon and zincblende semiconductors, the investigation of electric-field-induced second harmonic fields at the semiconductor interface, the detection of surface impurities, and, very recently, the study of surface defects such as twin boundaries in wurtzite semiconductors. In this work, we show, using nonlinear optical techniques such as the nonlinear bond model, how arbitrary polarization of the incoming electric field in rotational anisotropy spectroscopy experiments can provide more information regarding the origin of the nonlinear sources in zincblende and wurtzite semiconductor structures. In addition, using hyperpolarizability considerations, we describe how the nonlinear susceptibility tensor describing SHG can be well modelled using only a few parameters because of the symmetry of the bonds. We also show how the third harmonic intensity feature changes considerably when the incoming field polarization is changed from s-polarized to p-polarized. Finally, we propose a method to investigate surface reconstruction and defects in wurtzite and zincblende structures at the nanoscale level.
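For a surface with threefold symmetry, the rotational-anisotropy SHG intensity is commonly written as I(φ) = |a + b·cos 3φ|², where a and b are effective isotropic and anisotropic amplitudes. The sketch below evaluates that form with assumed amplitudes, purely to illustrate how few parameters such a symmetry-constrained fit needs; it is not the specific model of this paper.

```python
import math

def ra_shg_intensity(phi, a=1.0, b=0.6):
    """Illustrative RA-SHG intensity for a threefold-symmetric surface,
    I(phi) = |a + b*cos(3*phi)|^2; amplitudes a, b are assumed values
    standing in for effective bond hyperpolarizability parameters."""
    field = a + b * math.cos(3.0 * phi)
    return field * field

# Intensity sampled over a full azimuthal rotation, one point per degree.
curve = [ra_shg_intensity(2 * math.pi * k / 360) for k in range(360)]
```

The threefold periodicity of the curve is exactly the rotational anisotropy pattern measured in such experiments.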

Keywords: surface characterization, bond model, rotational anisotropy spectroscopy, effective hyperpolarizability

Procedia PDF Downloads 138
6230 Comparative Study on Sensory Profiles of Liquor from Different Dried Cocoa Beans

Authors: Khairul Bariah Sulaiman, Tajul Aris Yang

Abstract:

Malaysian dried cocoa beans have been reported to have low-quality flavour and are often sold at discounted prices. Various efforts have been made to improve the quality of Malaysian beans, among them the introduction of the shallow box fermentation technique and pulp preconditioning through pod storage. However, after nearly four decades of effort, Malaysian cocoa farmers still receive lower prices for their beans. This study was therefore carried out to assess the flavour quality of dried cocoa beans produced by the shallow box fermentation technique, and by the combination of shallow box fermentation with pod storage, compared to dried cocoa beans obtained from Ghana. A total of eight samples of dried cocoa were used in this study; one sample was Ghanaian beans (coded no. 8), while the rest were Malaysian cocoa beans with different post-harvest processing (coded nos. 1-7). Cocoa liquor was prepared from all samples using the prescribed techniques, and sensory evaluation was carried out using the Quantitative Descriptive Analysis (QDA) method on a 0-10 scale by Malaysian Cocoa Board trained panelists. The sensory evaluation showed that the cocoa attribute for all cocoa liquors ranged from 3.5 to 5.3, whereas bitterness ranged from 3.4 to 4.6 and the astringent attribute from 3.9 to 5.5. Meanwhile, all cocoa liquors had an acid or sourness attribute ranging from 1.6 to 3.6. In general, the cocoa liquor prepared from sample no. 4 had an almost similar flavour profile, not significantly different at p < 0.05 from the Ghanaian sample, in terms of most flavour attributes compared to the other six samples.

Keywords: cocoa beans, flavour, fermentation, shallow box, pods storage

Procedia PDF Downloads 366
6229 Efficient Microspore Isolation Methods for High Yield Embryoids and Regeneration in Rice (Oryza sativa L.)

Authors: S. M. Shahinul Islam, Israt Ara, Narendra Tuteja, Sreeramanan Subramaniam

Abstract:

Through anther and microspore culture methods, completely homozygous plants can be produced within a year, as compared with the long inbreeding method. Isolated microspore culture is one of the most important techniques for the rapid development of haploid plants. The efficiency of this method is influenced by several factors, such as culture conditions, growth regulators, plant media, pretreatments, the physical and growth conditions of the donor plants, and the pollen isolation procedure. The main purpose of this study was to improve the isolated microspore culture protocol in order to increase the yield of embryoids and their regeneration and to reduce albinism. In this study we tested mainly three different microspore isolation procedures (by glass rod, by homogenizer, and by blending) and determined their effect on gametic embryogenesis. Three types of media were used: washing, pre-culture, and induction. The induction medium was AMC (modified MS), supplemented with 2,4-D (2.5 mg/l), kinetin (0.5 mg/l), a higher amount of D-mannitol (90 g/l) instead of sucrose, and two amino acids (L-glutamine and L-serine). Of the three microspore isolation procedures, homogenizer isolation (P4) showed the best performance in ELS induction (177%) and green plantlets (104%) compared with the other techniques. Albinism occurred in all cases, but microspore isolation from excised anthers by glass rod and homogenizer yielded fewer albino plants, which was also one of the important findings of this study.

Keywords: androgenesis, pretreatment, microspore culture, regeneration, albino plants, Oryza sativa

Procedia PDF Downloads 339
6228 Ant Lion Optimization in a Fuzzy System for Benchmark Control Problem

Authors: Leticia Cervantes, Edith Garcia, Oscar Castillo

Abstract:

Today there are many control problems in which the main objective is to obtain the best possible control and decrease the error of the application. Many techniques can be used to address such problems, including neural networks, PID control, fuzzy logic, and optimization techniques. In this work, fuzzy logic and an optimization technique are combined: Ant Lion Optimization (ALO) is used to optimize a fuzzy system that controls the velocity of a simple treadmill, and the main objective is to achieve velocity control using the ALO optimization. First, a simple fuzzy system with two inputs (error and error change) and one output (desired speed) was used to control the treadmill velocity; then, to decrease the error, ALO was applied to optimize this fuzzy system. With the optimized system, the simulation was performed, and the results show that velocity control using ALO was better than with the conventional fuzzy system. This paper introduces the basic concepts needed to understand the work, describes the methodology of the investigation (control problem, fuzzy system design, optimization), presents the results, and compares the simple fuzzy system with the optimized one, showing that the optimization improved the control. The major finding of the study is that ALO is a good alternative for improving control, because it helped decrease the error regardless of the control technique being optimized. Finally, the selected methodology proved suitable, as the treadmill control was improved by the optimization technique.
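The two-input, one-output fuzzy controller described above can be sketched in a few lines. The membership functions, rule base, and output singletons below are illustrative assumptions (a zero-order Sugeno simplification over a normalized universe), not the authors' tuned or ALO-optimized values:

```python
# Minimal fuzzy controller sketch for treadmill speed: two inputs
# (error, change of error), one output (speed correction). All
# parameters here are hypothetical, not the paper's optimized values.

def tri(x, a, b, c):
    """Triangular membership with vertices a <= b <= c (shoulders allowed)."""
    if x < a or x > c:
        return 0.0
    if x <= b:
        return 1.0 if b == a else (x - a) / (b - a)
    return 1.0 if c == b else (c - x) / (c - b)

# Fuzzy sets over a normalized [-1, 1] universe for both inputs.
NEG = (-1.0, -1.0, 0.0)   # negative
ZER = (-0.5, 0.0, 0.5)    # zero
POS = (0.0, 1.0, 1.0)     # positive

# Output singletons (speed correction), a common simplification.
OUT = {"slow": -1.0, "hold": 0.0, "fast": 1.0}

# Rule base: (error set, delta-error set) -> output label.
RULES = [
    (NEG, NEG, "slow"), (NEG, ZER, "slow"), (NEG, POS, "hold"),
    (ZER, NEG, "slow"), (ZER, ZER, "hold"), (ZER, POS, "fast"),
    (POS, NEG, "hold"), (POS, ZER, "fast"), (POS, POS, "fast"),
]

def infer(error, d_error):
    """Weighted average of output singletons over the fired rules."""
    num = den = 0.0
    for e_set, de_set, label in RULES:
        w = min(tri(error, *e_set), tri(d_error, *de_set))  # AND = min
        num += w * OUT[label]
        den += w
    return num / den if den else 0.0

print(infer(0.0, 0.0))   # zero error, zero change -> hold (0.0)
print(infer(1.0, 0.0))   # large positive error -> speed up (1.0)
```

An optimizer such as ALO would then adjust the membership vertices (and possibly the singletons) to minimize the accumulated control error over a simulation run.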

Keywords: ant lion optimization, control problem, fuzzy control, fuzzy system

Procedia PDF Downloads 374
6227 Evaluating the Radiation Dose Involved in Interventional Radiology Procedures

Authors: Kholood Baron

Abstract:

Radiologic interventional studies use fluoroscopic imaging guidance to perform both diagnostic and therapeutic procedures. These can result in high radiation doses being delivered to the patients and to the radiology team, due to the prolonged fluoroscopy time and the large number of images taken, even when dose-minimizing techniques and modern fluoroscopic tools are applied. Since these procedures are part of the everyday routine of interventional radiologists, assistant nurses, and radiographers, it is important to estimate the radiation dose they receive in order to give objective advice and reduce the exposure of both the patients and the radiology team. The aim of this study was to determine the total radiation dose reaching the radiologist and the patient during an interventional procedure and to determine the impact of certain parameters on the patient dose. Method: The radiation dose was measured with TLDs (thermoluminescent dosimeters). Physicians, patients, nurses, and radiographers wore TLDs during 12 interventional radiology procedures performed in two hospitals, Mubarak Hospital and the Chest Hospital. This study highlights the need for interventional radiologists to be mindful of the radiation doses received by both patients and medical staff during interventional radiology procedures. The findings emphasize the impact of factors such as fluoroscopy duration and the number of images taken on the patient dose. By raising awareness and providing insights into optimizing techniques and protective measures, this research contributes to the overall goal of reducing radiation doses and ensuring the safety of patients and medical staff.

Keywords: dosimetry, radiation dose, interventional radiology procedures, patient radiation dose

Procedia PDF Downloads 85
6226 Challenges and Opportunities: One Stop Processing for the Automation of Indonesian Large-Scale Topographic Base Map Using Airborne LiDAR Data

Authors: Elyta Widyaningrum

Abstract:

LiDAR data acquisition has been recognized as one of the fastest solutions for providing the base data for topographic base mapping in Indonesia. The challenge of accelerating the provision of large-scale topographic base maps, as the basis of development plans, creates an opportunity to implement an automated scheme in the map production process. One-stop processing will also help accelerate map provision, in particular conformance with the Indonesian fundamental spatial data catalog derived from ISO 19110 and geospatial database integration. Thus, automated LiDAR classification, DTM generation, and feature extraction will be conducted in a single GIS software environment to form all layers of the topographic base maps. The quality of the automatically produced topographic base map will be assessed and analyzed based on its completeness, correctness, contiguity, consistency, and possible customization.
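The automated LiDAR classification step mentioned above can be illustrated with a deliberately simplified ground filter: a point is labeled ground if it lies within a height threshold of the lowest return in its grid cell. The cell size and threshold are illustrative assumptions; production pipelines use more robust filters (e.g., progressive TIN densification):

```python
# Simplified sketch of grid-based ground/non-ground classification,
# one early step of an automated LiDAR-to-base-map pipeline.
from collections import defaultdict

def classify_ground(points, cell=5.0, dz=0.3):
    """points: list of (x, y, z). Returns a 'ground'/'non-ground' label each."""
    # Pass 1: minimum elevation per grid cell.
    zmin = defaultdict(lambda: float("inf"))
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        zmin[key] = min(zmin[key], z)
    # Pass 2: label each point relative to its cell minimum.
    labels = []
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        labels.append("ground" if z - zmin[key] <= dz else "non-ground")
    return labels

pts = [(1.0, 1.0, 100.0),   # bare-earth return
       (2.0, 1.5, 100.2),   # low vegetation near the ground
       (3.0, 2.0, 108.0)]   # building roof return
print(classify_ground(pts))  # ['ground', 'ground', 'non-ground']
```

DTM generation would then interpolate only the ground-labeled points, while feature extraction works on the remaining returns.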

Keywords: automation, GIS environment, LiDAR processing, map quality

Procedia PDF Downloads 345
6225 Cut-Out Animation as a Technique and Its Development within the Historical Process

Authors: Armagan Gokcearslan

Abstract:

The art of animation has developed very rapidly, in terms of script, sound and music, motion, character design, the techniques being used, and the technological tools being developed, from its first years until today. Technical variety attracts particular attention in the art of animation. Perceived as a kind of illusion in the beginning, animations commonly used the flash sketch technique: animation artists using it created scenes by drawing them on a blackboard with chalk. The flash sketch technique was used by pioneering animation artists such as Emile Cohl, Winsor McCay, and Blackton. Later, tools like the magic lantern, thaumatrope, phenakistoscope, and zoetrope were developed and used intensively in the first years of the art of animation. Today, on the other hand, the art of animation is shaped by developments in computer technology, and it is possible to create three-dimensional and two-dimensional animations with the help of various computer software. The cut-out technique is among the important techniques used in the art of animation. It is based on the art of paper cutting, and examining cut-out animations, one observes that they technically resemble it. The art of paper cutting has a deep-rooted history; its oldest samples can be found in China in the period after the 2nd century B.C., when the Chinese invented paper. The best-known artist using the cut-out animation technique is the German artist Lotte Reiniger. This study, titled "Cut-Out Animation as a Technique and Its Development within the Historical Process", will cover the art of paper cutting, the relationship between the art of paper cutting and cut-out animation, its development within the historical process, animation artists producing artworks in this field, important cut-out animations, and their technical properties.

Keywords: cut-out, paper art, animation, technique

Procedia PDF Downloads 249
6224 Markowitz and Implementation of a Multi-Objective Evolutionary Technique Applied to the Colombia Stock Exchange (2009-2015)

Authors: Feijoo E. Colomine Duran, Carlos E. Peñaloza Corredor

Abstract:

The selection of the components of a financial investment portfolio poses a variety of problems that can be addressed with optimization techniques under evolutionary schemes. By its nature, the component selection problem involves a dichotomous relationship between two opposing elements: the portfolio's performance and the risk incurred in choosing it. Markowitz modeled this relationship as a mean (performance) versus variance (risk) problem, i.e., performance must be maximized while risk is minimized. This research included the study and implementation of multi-objective evolutionary techniques to solve this problem, taking as the experimental framework the equities market of the Colombia Stock Exchange between 2009 and 2015. Three multi-objective evolutionary algorithms, namely the Nondominated Sorting Genetic Algorithm II (NSGA-II), the Strength Pareto Evolutionary Algorithm 2 (SPEA2), and Indicator-Based Selection in Multiobjective Search (IBEA), were compared using two well-known performance measures, the hypervolume indicator and the R_2 indicator, together with a nonparametric statistical analysis based on the Wilcoxon rank-sum test. The comparative analysis also includes an evaluation, through the Sharpe ratio, of the financial efficiency of the investment portfolios produced by the various algorithms. It is shown that the portfolios provided by these algorithms are very well positioned among the different stock indices published by the Colombia Stock Exchange.
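The mean-variance trade-off described above can be sketched as a bi-objective problem: maximize expected return, minimize variance, and keep the non-dominated (Pareto) portfolios. The asset statistics and candidate weights below are illustrative, and the paper evolves weights with NSGA-II/SPEA2/IBEA rather than enumerating a fixed candidate set:

```python
# Minimal Markowitz mean-variance sketch with a Pareto filter.

def portfolio_stats(weights, mean_returns, cov):
    """Expected return and variance of a long-only portfolio."""
    n = len(weights)
    ret = sum(w * m for w, m in zip(weights, mean_returns))
    var = sum(weights[i] * weights[j] * cov[i][j]
              for i in range(n) for j in range(n))
    return ret, var

def pareto_front(candidates):
    """Keep portfolios not dominated in (return up, variance down)."""
    front = []
    for i, (r1, v1) in enumerate(candidates):
        dominated = any(r2 >= r1 and v2 <= v1 and (r2, v2) != (r1, v1)
                        for j, (r2, v2) in enumerate(candidates) if j != i)
        if not dominated:
            front.append((r1, v1))
    return front

# Two hypothetical assets: higher return comes with higher variance.
mu = [0.12, 0.06]
cov = [[0.04, 0.01],
       [0.01, 0.02]]
weight_sets = [(0.0, 1.0), (0.25, 0.75), (0.5, 0.5), (0.75, 0.25), (1.0, 0.0)]
stats = [portfolio_stats(w, mu, cov) for w in weight_sets]
print(pareto_front(stats))  # all-bonds portfolio is dominated by a mix
```

Note that the all-second-asset portfolio is dominated: a 25/75 mix has both a higher return and, thanks to diversification, a lower variance. An evolutionary algorithm searches the weight space for exactly this kind of front.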

Keywords: finance, optimization, portfolio, Markowitz, evolutionary algorithms

Procedia PDF Downloads 275
6223 Incremental Learning of Independent Topic Analysis

Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda

Abstract:

In this paper, we present a method for applying Independent Topic Analysis (ITA) to a growing collection of documents. The amount of document data has been increasing since the spread of the Internet, and ITA was proposed as one method to analyze such data. ITA extracts independent topics from document data using Independent Component Analysis (ICA), a technique from signal processing. However, it is difficult to apply ITA to a growing collection, because ITA must process all of the document data at once, so its temporal and spatial costs are very high. We therefore present Incremental ITA, which extracts independent topics from an increasing number of documents. Incremental ITA updates the independent topics whenever document data are added, starting from the topics extracted from the immediately preceding data rather than recomputing from scratch. Finally, we show the results of applying Incremental ITA to benchmark datasets.
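The cost argument above can be illustrated with the bookkeeping that incremental schemes rely on: statistics such as the mean of the document feature vectors (used, e.g., for centering and whitening before ICA) can be updated from a new batch of documents without revisiting the full collection. This is a simplified sketch of that idea, not the authors' full Incremental ITA update:

```python
# Incremental update of per-feature means over document vectors:
# each batch costs O(len(batch)), independent of the collection size.

class RunningStats:
    """Running mean of document feature vectors, updatable per batch."""
    def __init__(self, dim):
        self.n = 0
        self.mean = [0.0] * dim

    def add_batch(self, docs):
        """Fold in new documents; earlier documents are never reread."""
        for vec in docs:
            self.n += 1
            for k, x in enumerate(vec):
                self.mean[k] += (x - self.mean[k]) / self.n

old_docs = [[1.0, 0.0], [3.0, 2.0]]
new_docs = [[2.0, 4.0]]

stats = RunningStats(dim=2)
stats.add_batch(old_docs)   # initial collection
stats.add_batch(new_docs)   # later arrivals: incremental update
print(stats.mean)           # equals the batch mean over all three docs
```

The full method would analogously warm-start the ICA unmixing matrix from the previous topics instead of re-estimating it over the entire corpus.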

Keywords: text mining, topic extraction, independent, incremental, independent component analysis

Procedia PDF Downloads 286
6222 Management of Urban Drainage: A Study of the Application of Technologies and Legislation in Goiania, Brazil

Authors: Vinicius Marzall, Jussanã Milograna

Abstract:

Urban stormwater drainage remains a major challenge for most Brazilian cities. Like most of them, Goiania, a state capital located in the Midwest of the country, has little legislation on the subject and only one registered compensatory drainage technique. This paper aims to present solutions adopted in other Brazilian cities with consolidated legislation, suggesting techniques for detention tanks on building sites. The study analyzed and compared the legislation of Curitiba, Porto Alegre, and Sao Paulo with the current legislation and policies of Goiania. Models were then created, with adopted data, for dimensioning detention tanks using the envelope curve method, considering synthetic series of intense precipitation and building sites between 250 m² and 600 m² with an imperviousness ratio of 50%. The results showed great differences between the legislation of Goiania and the documentation of the other cities analyzed, for example in the number of drainage techniques suited to the reality of each city, in educational actions to make the population aware of caring for watercourses, and in political management through dedicated funds for drainage matters. Moreover, the detention tank proved practicable, given that it occupies less than 3% of the building site, whatever the size of the terrain, while keeping the outflow at pre-occupation rates during extreme rainfall events. A linear equation was also developed to size the detention tank based on the building site area in Goiania, making the calculation and implementation simpler for non-specialists.
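The envelope curve method used above sizes the tank as the maximum, over storm duration, of inflow volume minus allowed outflow volume. The IDF coefficients, runoff coefficient, and allowed outflow below are illustrative assumptions, not values fitted to Goiania's rainfall records:

```python
# Envelope-curve detention tank sizing sketch (volumes in cubic meters).

def idf(t_min):
    """Hypothetical IDF curve i(t) = a / (t + b)**n, mm/h, t in minutes."""
    return 3000.0 / (t_min + 20.0) ** 0.9

def detention_volume(area_m2, runoff_coeff, q_out_lps, durations_min):
    """Max over durations of (inflow volume - allowed outflow volume)."""
    best = 0.0
    for t in durations_min:
        i_mm_h = idf(t)
        # Rational-method inflow over duration t: C * i * A * t.
        inflow = runoff_coeff * (i_mm_h / 1000.0) * area_m2 * (t / 60.0)
        # Volume released at the allowed (pre-occupation) outflow rate.
        outflow = (q_out_lps / 1000.0) * t * 60.0
        best = max(best, inflow - outflow)
    return best

vol = detention_volume(area_m2=600.0, runoff_coeff=0.5,
                       q_out_lps=5.0, durations_min=range(5, 181, 5))
print(round(vol, 1))  # required storage for the critical duration, m3
```

Sweeping `area_m2` over the studied range (250 to 600 m²) and regressing volume against area would recover a linear sizing rule of the kind the paper proposes.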

Keywords: clean technology, legislation, rainwater management, urban drainage

Procedia PDF Downloads 136
6221 Biodiesel Production and Heavy Metal Removal by Aspergillus fumigatus sp.

Authors: Ahmed M. Haddad, Hadeel S. El-Shaal, Gadallah M. Abu-Elreesh

Abstract:

Some filamentous fungi can be used for biodiesel production, as they are able to accumulate high amounts of intracellular lipids when grown under stress conditions. An Aspergillus fumigatus sp. was isolated from Nile Delta soil in Egypt. The fungus was first screened for its capacity to accumulate lipids using a Nile red staining assay; it could accumulate more than 20% of its biomass as lipids when grown in an optimized minimal medium. After lipid extraction, the fungal cell debris could be used to remove some heavy metals from contaminated wastewater: it removed Cd, Cr, and Zn with absorption efficiencies of 73%, 83.43%, and 69.39%, respectively. In conclusion, the Aspergillus fumigatus isolate may be considered a promising biodiesel producer, and its biomass waste can be further used for the bioremediation of wastewater contaminated with heavy metals.
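The absorption efficiencies quoted above follow from initial and final metal concentrations; a small sketch of that calculation, with hypothetical concentration values chosen to reproduce the reported Cr figure of 83.43%:

```python
# Removal (absorption) efficiency from before/after concentrations.

def removal_efficiency(c_initial, c_final):
    """Percent of metal removed from solution."""
    return (c_initial - c_final) / c_initial * 100.0

# Hypothetical Cr concentrations (mg/L) before and after biosorption.
print(round(removal_efficiency(100.0, 16.57), 2))  # 83.43
```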

Keywords: biodiesel, bioremediation, fungi, heavy metals, lipids, oleaginous

Procedia PDF Downloads 204
6220 Visual Inspection of Road Conditions Using Deep Convolutional Neural Networks

Authors: Christos Theoharatos, Dimitris Tsourounis, Spiros Oikonomou, Andreas Makedonas

Abstract:

This paper focuses on the problem of visually inspecting and recognizing the road conditions in front of moving vehicles, targeting automotive scenarios. The goal of road inspection is to identify whether the road is slippery or not, as well as to detect possible anomalies on the road surface, such as potholes or speed bumps/humps. Our work is based on an artificial intelligence methodology for real-time monitoring of road conditions in autonomous driving scenarios, using state-of-the-art deep convolutional neural network (CNN) techniques. Initially, the road and the ego lane are segmented within the field of view of a camera integrated into the front part of the vehicle. A novel classification CNN is utilized to discriminate between plain and slippery road textures (e.g., wet, snow, etc.). Simultaneously, a robust detection CNN identifies severe surface anomalies within the ego lane, such as potholes and speed bumps/humps, at a distance of 5 to 25 meters. The overall methodology is illustrated within the scope of an integrated application (or system) that can be incorporated into complete Advanced Driver-Assistance Systems (ADAS) providing a full range of functionalities. The proposed techniques deliver state-of-the-art detection and classification results with real-time performance when running on AI accelerator devices such as Intel's Myriad 2/X Vision Processing Unit (VPU).

Keywords: deep learning, convolutional neural networks, road condition classification, embedded systems

Procedia PDF Downloads 107
6219 A Supervised Face Parts Labeling Framework

Authors: Khalil Khan, Ikram Syed, Muhammad Ehsan Mazhar, Iran Uddin, Nasir Ahmad

Abstract:

Face parts labeling is the process of assigning class labels to each face part. A face parts labeling (FPL) method, which divides a given image into its constituent parts, is proposed in this paper. A database, FaceD, consisting of 564 images was labeled by hand and made publicly available. A supervised learning model is built through the extraction of features from the training data. The testing phase is performed with two semantic segmentation methods, i.e., pixel-based and super-pixel-based segmentation. In pixel-based segmentation, a class label is assigned to each pixel individually. In the super-pixel-based method, a class label is assigned to each super-pixel, so the same label is given to all pixels inside a super-pixel. The pixel labeling accuracy reported with the pixel-based and super-pixel-based methods is 97.68% and 93.45%, respectively.
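The two evaluation settings described above can be sketched on a toy example: per-pixel labeling versus super-pixel labeling, where every pixel inside a super-pixel inherits a single label (here chosen by majority vote, one common convention; the tiny ground-truth map is illustrative, not drawn from the FaceD database):

```python
# Pixel vs. super-pixel labeling accuracy on a flattened 1-D toy image.
from collections import Counter

def superpixel_labels(pred, segments):
    """Replace each pixel's label with its super-pixel's majority label."""
    votes = {}
    for p, s in zip(pred, segments):
        votes.setdefault(s, Counter())[p] += 1  # tally per super-pixel
    majority = {s: c.most_common(1)[0][0] for s, c in votes.items()}
    return [majority[s] for s in segments]

def pixel_accuracy(pred, truth):
    """Fraction of pixels whose label matches the ground truth."""
    return sum(p == t for p, t in zip(pred, truth)) / len(truth)

truth    = ["skin", "skin", "skin", "hair", "hair", "hair"]
pred     = ["skin", "hair", "skin", "hair", "hair", "skin"]
segments = [0, 0, 0, 1, 1, 1]        # two super-pixels

print(pixel_accuracy(pred, truth))                 # per-pixel: 4/6
print(pixel_accuracy(superpixel_labels(pred, segments), truth))  # 6/6
```

In this toy case the majority vote corrects isolated pixel errors; in practice super-pixel labeling can also lock in errors when a super-pixel straddles a part boundary, which is consistent with the lower super-pixel accuracy (93.45%) the paper reports.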

Keywords: face labeling, semantic segmentation, classification, face segmentation

Procedia PDF Downloads 235
6218 Pricing Strategy in Marketing: Balancing Value and Profitability

Authors: Mohsen Akhlaghi, Tahereh Ebrahimi

Abstract:

Pricing strategy is a vital component in achieving the balance between customer value and business profitability. The aim of this study is to provide insights into the factors, techniques, and approaches involved in pricing decisions. The study utilizes a descriptive approach to discuss various aspects of pricing strategy in marketing, drawing on concepts from market research, consumer psychology, competitive analysis, and adaptability. This approach presents a comprehensive view of pricing decisions. The result of this exploration is a framework that highlights key factors influencing pricing decisions. The study examines how factors such as market positioning, product differentiation, and brand image shape pricing strategies. Additionally, it emphasizes the role of consumer psychology in understanding price elasticity, perceived value, and price-quality associations that influence consumer behavior. Various pricing techniques, including charm pricing, prestige pricing, and bundle pricing, are mentioned as methods to enhance sales by influencing consumer perceptions. The study also underscores the importance of adaptability in responding to market dynamics through regular price monitoring, dynamic pricing, and promotional strategies. It recognizes the role of digital platforms in enabling personalized pricing and dynamic pricing models. In conclusion, the study emphasizes that effective pricing strategies strike a balance between customer value and business profitability, ultimately driving sales, enhancing brand perception, and fostering lasting customer relationships.

Keywords: business, customer benefits, marketing, pricing

Procedia PDF Downloads 51
6217 Hiveopolis - Honey Harvester System

Authors: Erol Bayraktarov, Asya Ilgun, Thomas Schickl, Alexandre Campo, Nicolis Stamatios

Abstract:

Traditional means of harvesting honey are often stressful for honeybees: each time honey is collected, a portion of the colony can die. In consequence, the colonies' resilience to environmental stressors decreases, which ultimately contributes to the global problem of honeybee colony losses. As part of the HIVEOPOLIS project, we design and build a different kind of beehive, incorporating technology to reduce the negative impacts of beekeeping procedures, including honey harvesting. A first step toward more sustainable honey harvesting practices is to design honey storage frames that automate the honey collection procedure. This way, beekeepers save time, money, and labor by not having to open the hive and remove frames, and the honeybees' nest stays undisturbed. The system shows promising features, e.g., high reliability, which could be a key advantage over current honey harvesting technologies. Our original concept of fractional honey harvesting has been to remove honey only from 'safe' locations and at levels that would leave the bees enough high-nutritional-value honey. In this abstract, we describe the current state of our honey harvester, its technology, and areas for improvement. The honey harvester works by separating the honeycomb cells away from the comb foundation; the movement and the elastic nature of honey support this functionality, and the honey sticks to the foundation because of surface tension forces amplified by the geometry. In the future, by monitoring the weight, and therefore the number of capped honey cells on our harvester frames, we will be able to remove honey as soon as the weight measuring system reports that the comb is ready for harvesting. Higher-viscosity or crystallized honey poses a challenge in temperate locations where a smooth flow of honey is required. We use resistive heaters to soften the propolis and wax and unglue the moving parts during extraction. These heaters can also melt the honey slightly to reach the needed flow state, and precise control of them allows us to operate the device for several purposes. As an actuation method we use Nitinol springs, which are activated by heat: unlike the conventional stepper or servo motors that we also evaluated during development, the springs and heaters take up less space and reduce the overall system complexity. Honeybee acceptance was unknown until we actually inserted a device inside a hive. We not only observed bees walking on the artificial comb but also building wax, filling gaps with propolis, and storing honey; this also shows that bees do not mind living in spaces and hives built from 3D-printed materials. We do not yet have data to prove that the plastic materials do not affect the chemical composition of the honey. We succeeded in automatically extracting stored honey from the device, demonstrating a useful extraction flow and effective overall operation.

Keywords: honey harvesting, honeybee, hiveopolis, nitinol

Procedia PDF Downloads 89