Search results for: VOF (volume of fluid method)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21922


14032 Ultrasound/Microwave Assisted Extraction Recovery and Identification of Bioactive Compounds (Polyphenols) from Tarbush (Fluorensia cernua)

Authors: Marisol Rodriguez-Duarte, Aide Saenz-Galindo, Carolina Flores-Gallegos, Raul Rodriguez-Herrera, Juan Ascacio-Valdes

Abstract:

Tarbush (Fluorensia cernua) is a plant native to northern Mexico, mainly the states of Coahuila, Durango, San Luis Potosí, Zacatecas and Chihuahua. It is a branched shrub belonging to the family Asteraceae, with oval leaves 6 to 11 cm in length and small yellow flowers. In Mexico, tarbush is a highly valued plant because it has been used as a traditional medicinal agent for the treatment of gastrointestinal diseases and skin infections and as a healing agent, mainly in the form of an infusion. Despite this traditional use, the content and types of the phytochemicals responsible for its biological properties are currently unknown, so their recovery and identification are very important, because such compounds have relevant applications in the fields of food, pharmaceuticals and medicine. The objective of this work was to determine the best condition for extracting phytochemical compounds (mainly polyphenolic compounds) from the leaves using ultrasound/microwave-assisted extraction (U/M-AE). To this end, U/M-AE extractions were performed evaluating three mass/volume ratios (1:8, 1:12, 1:16), three ethanol/water solvent concentrations (0%, 30% and 70%), an ultrasound extraction time of 20 min, and 5 min of microwave treatment at 70°C. All experiments followed a fractional factorial experimental design. Once the best extraction condition was defined, the compounds were recovered by liquid column chromatography using Amberlite XAD-16; the polyphenolic fraction was recovered with ethanol and then evaporated. The recovered polyphenolic compounds were quantified by spectrophotometric techniques and identified by HPLC/ESI/MS. The results showed that the best extraction condition was a mass/volume ratio of 1:8 and an ethanol/water solvent concentration of 70%.
The polyphenol concentration obtained under this condition was 22.74 mg/g, and 16 compounds of polyphenolic origin were identified. These results allow us to postulate the Mexican plant known as tarbush as a relevant source of bioactive polyphenolic compounds of food, pharmaceutical and medicinal interest.
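As a rough illustration of the screening logic, the condition grid described above can be enumerated and the best condition picked by measured yield. This is a hedged sketch: only the winning condition (1:8, 70%) and its 22.74 mg/g yield come from the abstract; every other yield value is a placeholder.

```python
from itertools import product

# Screening grid from the abstract: three mass/volume ratios crossed with
# three ethanol/water concentrations.
ratios = ["1:8", "1:12", "1:16"]
solvents = [0, 30, 70]          # % ethanol in water

# Reported optimum; all remaining grid points get illustrative placeholders
# (the paper does not report them).
yields = {("1:8", 70): 22.74}   # mg polyphenols per g
for cond in product(ratios, solvents):
    yields.setdefault(cond, 15.0)

best = max(yields, key=yields.get)
print(best, yields[best])       # ('1:8', 70) 22.74
```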

Keywords: U/M-AE, tarbush, polyphenols, identification

Procedia PDF Downloads 159
14031 Parametric Appraisal of Robotic Arc Welding of Mild Steel Material by Principal Component Analysis-Fuzzy with Taguchi Technique

Authors: Amruta Rout, Golak Bihari Mahanta, Gunji Bala Murali, Bibhuti Bhusan Biswal, B. B. V. L. Deepak

Abstract:

The use of industrial robots to perform welding operations is one of the chief signs of contemporary welding. Modeling weld joint parameters and weld process parameters is one of the most crucial aspects of robotic welding. As the weld process parameters affect the weld joint parameters differently, a multi-objective optimization technique has to be utilized to obtain an optimal setting of the weld process parameters. In this paper, a hybrid optimization technique, i.e., Principal Component Analysis (PCA) combined with fuzzy logic, is proposed to obtain an optimal setting of weld process parameters such as wire feed rate, welding current, gas flow rate, welding speed and nozzle tip-to-plate distance. The weld joint parameters considered for optimization are the depth of penetration, yield strength, and ultimate strength. PCA is a very efficient multi-objective technique for converting correlated and dependent responses, such as the weld joint parameters, into uncorrelated and independent variables. In this approach there is also no need to check the correlation among responses, as no individual weights are assigned to them. A fuzzy inference engine can efficiently incorporate these aspects into its internal hierarchy, thereby overcoming various limitations of existing optimization approaches. Finally, the Taguchi method is used to obtain the optimal setting of the weld process parameters. It is concluded that the hybrid technique has its own advantages and can be used for quality improvement in industrial applications.
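The PCA step of such a hybrid approach can be sketched as follows: the correlated weld joint responses are standardized and projected onto principal components, and the explained-variance ratios serve as objective weights for a single composite index per experimental run. This is a generic illustration with invented response data, not the authors' implementation.

```python
import numpy as np

# Illustrative response matrix: 9 Taguchi runs x 3 correlated weld joint
# responses (penetration depth, yield strength, ultimate strength).
rng = np.random.default_rng(0)
base = rng.normal(size=(9, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(9, 1)) for _ in range(3)])

Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each response
C = np.corrcoef(Z, rowvar=False)           # correlation among responses
vals, vecs = np.linalg.eigh(C)             # eigen-decomposition (ascending)
order = np.argsort(vals)[::-1]             # sort components by variance
vals, vecs = vals[order], vecs[:, order]

weights = vals / vals.sum()                # explained-variance weights
scores = Z @ vecs                          # uncorrelated principal scores
composite = scores @ weights               # one quality index per run
```

Note that the sign of each principal component is arbitrary, so in practice the index must be oriented (e.g. so that larger means better) before ranking the runs.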

Keywords: robotic arc welding, weld process parameters, weld joint parameters, principal component analysis, fuzzy logic, Taguchi method

Procedia PDF Downloads 175
14030 Management of the Experts in the Research Evaluation System of the University: Based on National Research University Higher School of Economics Example

Authors: Alena Nesterenko, Svetlana Petrikova

Abstract:

Research evaluation is one of the most important elements of the self-regulation and development of researchers, as it is an impartial and independent assessment process. The method of expert evaluations, as a scientific instrument for solving complicated non-formalized problems, is, firstly, a scientifically sound way to conduct an assessment that maximizes the effectiveness of work at every step and, secondly, the usage of quantitative methods for evaluating and assessing expert opinion and for collective processing of the results. These two features distinguish the method of expert evaluations from the long-known expertise widespread in many areas of knowledge. Different typical problems require different types of expert evaluation methods. Several issues that arise with these methods are the selection of experts, the management of the assessment procedure, the processing of the results and the remuneration of the experts. To address these issues, an on-line system was created with the primary purpose of developing a versatile application for many workgroups with matching approaches to scientific work management. The online documentation assessment and statistics system allows:
- To realize within one platform the independent activities of different workgroups (e.g. expert officers, managers).
- To establish different workspaces for the corresponding workgroups, where custom user databases can be created according to particular needs.
- To form the required output documents for each workgroup.
- To configure information gathering for each workgroup (forms of assessment, tests, inventories).
- To create and operate personal databases of remote users.
- To set up automatic notification through e-mail.
The next stage is the development of quantitative and qualitative criteria to form a database of experts.
The inventory was designed so that the experts submit not only their personal data, place of work and scientific degree but also keywords reflecting their expertise and academic interests, ORCID, Researcher ID, SPIN-code RSCI, Scopus AuthorID, knowledge of languages, and primary scientific publications. For each project, competition assessments are processed in accordance with the ordering party's demands in the form of appraised inventories, commentaries (50-250 characters) and an overall review (1500 characters) in which the expert states the absence of a conflict of interest. Evaluation is conducted as follows: as applications are added to the database, the expert officer selects experts, generally two per application. Experts are selected according to the keywords; this method proved effective, unlike the OECD classifier. In the last stage, the choice of experts is approved by the supervisor, and e-mails are sent to the experts inviting them to assess the project. An expert supervisor oversees the experts' report writing so that all formalities are in place (time frame, propriety, correspondence). If the difference between assessments exceeds four points, a third evaluation is appointed. When the expert finishes work on his expert opinion, the system shows the contract marked 'new', the managers process the contract, and the expert receives an e-mail stating that the contract is formed and ready to be signed. Once all formalities are concluded, the expert receives remuneration for his work. The specifics of the interaction of the examination officer with the experts will be presented in the report.
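The keyword-driven selection step described above can be sketched in a few lines: rank the experts in the database by keyword overlap with the application and take the top two. The expert records and keywords here are invented examples, not the system's actual data model.

```python
# Toy expert database: name -> set of self-declared keywords.
experts = {
    "expert_a": {"machine learning", "statistics", "optimization"},
    "expert_b": {"sociology", "survey methods"},
    "expert_c": {"statistics", "econometrics"},
}

def select_experts(application_keywords, db, n=2):
    """Return the n experts with the largest keyword overlap."""
    ranked = sorted(db, key=lambda e: len(db[e] & application_keywords),
                    reverse=True)
    return ranked[:n]

chosen = select_experts({"statistics", "optimization"}, experts)
```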

Keywords: expertise, management of research evaluation, method of expert evaluations, research evaluation

Procedia PDF Downloads 203
14029 Lightweight Sheet Molding Compound Composites by Coating Glass Fiber with Cellulose Nanocrystals

Authors: Amir Asadi, Karim Habib, Robert J. Moon, Kyriaki Kalaitzidou

Abstract:

There has been considerable interest in cellulose nanomaterials (CN) as reinforcement for polymers and polymer composites due to their high specific modulus and strength, low density and toxicity, and accessible hydroxyl side groups that can readily be chemically modified. The focus of this study is making lightweight composites for better fuel efficiency and lower CO2 emissions in the auto industry, with no compromise on mechanical performance, using a scalable technique that can easily be integrated into sheet molding compound (SMC) manufacturing lines. Lightweighting is achieved by replacing part of the heavier components, i.e. the glass fibers (GF), with a small amount of cellulose nanocrystals (CNC) in short GF/epoxy composites made by SMC. The CNC is introduced as a coating on the GF rovings prior to their use in the SMC line. The employed coating method is similar to the fiber sizing technique commonly used, and thus it can easily be scaled and integrated into industrial SMC lines. This is an alternative route to most techniques, which involve dispersing CN in the polymer matrix and in which nanomaterial agglomeration limits the capability for scaling up to industrial production. We have demonstrated that incorporating CNC as a coating on the GF surface by immersing the GF in aqueous CNC suspensions, a simple and scalable technique, increases the interfacial shear strength (IFSS) by ~69% compared to composites produced with uncoated GF, suggesting enhanced stress transfer across the GF/matrix interface. As a result of the IFSS enhancement, incorporation of 0.17 wt% CNC in the composite results in increases of ~10% in both elastic modulus and tensile strength, and of 40% and 43% in flexural modulus and strength, respectively. We have also determined that dispersing 1.4 and 2 wt% CNC in the epoxy matrix of short GF/epoxy SMC composites by sonication allows removing 10 wt% GF with no penalty on tensile and flexural properties, leading to 7.5% lighter composites.
Although sonication is a scalable technique, it is not as simple and inexpensive as coating the GF by passing it through an aqueous suspension of CNC. In this study, the above findings are integrated to 1) investigate the effect of CNC content on mechanical properties by passing the GF rovings through CNC aqueous suspensions of various concentrations (0-5%) and 2) determine the optimum ratio of added CNC to removed GF that achieves the maximum possible weight reduction at no cost to the mechanical performance of the SMC composites. The results of this study are of industrial relevance, providing a path toward producing high-volume lightweight and mechanically enhanced SMC composites using cellulose nanomaterials.
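The light-weighting arithmetic can be sanity-checked with the inverse rule of mixtures for composite density. All numbers below are assumptions for illustration (a 30 wt% GF loading and typical densities for E-glass, epoxy and CNC); the abstract does not state them, so the computed reduction only indicates the order of magnitude of the reported 7.5%.

```python
# Inverse rule of mixtures for composite density from weight fractions:
#   1 / rho_c = sum(w_i / rho_i)
def density(fractions_and_densities):
    """Composite density (g/cm^3) from (weight fraction, density) pairs."""
    return 1.0 / sum(w / rho for w, rho in fractions_and_densities)

# Assumed baseline: 30 wt% GF (2.55 g/cm3) in epoxy (1.20 g/cm3).
rho_before = density([(0.30, 2.55), (0.70, 1.20)])
# Assumed modified laminate: 10 wt% GF removed, 2 wt% CNC (1.55 g/cm3) added.
rho_after = density([(0.20, 2.55), (0.02, 1.55), (0.78, 1.20)])

reduction = 100.0 * (rho_before - rho_after) / rho_before   # percent lighter
```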

Keywords: cellulose nanocrystals, light weight polymer-matrix composites, mechanical properties, sheet molding compound (SMC)

Procedia PDF Downloads 221
14028 Important Factors Affecting the Effectiveness of Quality Control Circles

Authors: Sogol Zarafshan

Abstract:

The present study aimed to identify important factors affecting the effectiveness of quality control circles in a hospital and to rank them using a combination of fuzzy VIKOR and Grey Relational Analysis (GRA). The study population consisted of five academic members and five experts in the field of nursing working in a hospital, selected using a purposive sampling method. In addition, a sample of 107 nurses was selected through simple random sampling using their employee codes and a random-number table. The required data were collected using a researcher-made questionnaire consisting of 12 factors. The validity of this questionnaire was confirmed by the opinions of the experts and academic members who participated in the present study, as well as by confirmatory factor analysis. Its reliability was also verified (α=0.796). The collected data were analyzed using SPSS 22.0 and LISREL 8.8, as well as the VIKOR-GRA and IPA methods. The ranking of the factors affecting the effectiveness of quality control circles showed that the highest and lowest ranks belonged to 'Managers' and supervisors' support' and 'Group leadership', respectively. The hospital's performance was highest for factors such as 'Clear goals and objectives' and 'Group cohesiveness and homogeneity', and lowest for 'Reward system' and 'Feedback system'. The results showed that although 'Training the members', 'Using the right tools' and 'Reward system' were of great importance, the organization's performance on these factors was poor. Therefore, the managers of the studied hospital should pay more attention to these factors and improve them as soon as possible.
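The GRA half of the ranking procedure can be sketched generically: normalize each criterion ("larger is better"), compute grey relational coefficients against the ideal alternative, and average them into grades that induce the ranking. The 4x3 decision matrix below is invented for illustration; the study's actual inputs are the questionnaire scores.

```python
import numpy as np

# Illustrative decision matrix: 4 factors (rows) scored on 3 criteria.
X = np.array([[7.0, 8.0, 6.0],
              [9.0, 6.5, 7.0],
              [5.0, 9.0, 8.0],
              [8.0, 7.0, 9.0]])

norm = (X - X.min(0)) / (X.max(0) - X.min(0))   # larger-is-better scaling
delta = np.abs(1.0 - norm)                      # distance to the ideal (=1)
zeta = 0.5                                      # distinguishing coefficient
coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grades = coef.mean(axis=1)                      # grey relational grades
ranking = np.argsort(-grades)                   # best factor first
```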

Keywords: Quality control circles, Fuzzy VIKOR, Grey Relational Analysis, Importance–Performance Analysis

Procedia PDF Downloads 133
14027 Delamination Fracture Toughness Benefits of Inter-Woven Plies in Composite Laminates Produced through Automated Fibre Placement

Authors: Jayden Levy, Garth M. K. Pearce

Abstract:

An automated fibre placement method has been developed to build through-thickness reinforcement into carbon fibre reinforced plastic laminates during their production, with the goal of increasing delamination fracture toughness while circumventing the additional costs and defects imposed by post-layup stitching and z-pinning. Termed ‘inter-weaving’, the method uses custom placement sequences of thermoset prepreg tows to distribute regular fibre link regions in traditionally clean ply interfaces. Inter-weaving’s impact on mode I delamination fracture toughness was evaluated experimentally through double cantilever beam tests (ASTM standard D5528-13) on [±15°]9 laminates made from Park Electrochemical Corp. E-752-LT 1/4” carbon fibre prepreg tape. Unwoven and inter-woven automated fibre placement samples were compared to those of traditional laminates produced from standard uni-directional plies of the same material system. Unwoven automated fibre placement laminates were found to suffer a mostly constant 3.5% decrease in mode I delamination fracture toughness compared to flat uni-directional plies. Inter-weaving caused significant local fracture toughness increases (up to 50%), though these were offset by a matching overall reduction. These positive and negative behaviours of inter-woven laminates were respectively found to be caused by fibre breakage and matrix deformation at inter-weave sites, and the 3D layering of inter-woven ply interfaces providing numerous paths of least resistance for crack propagation.
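For context, the mode I fracture toughness in ASTM D5528 double cantilever beam tests is typically reduced via the modified beam theory expression G_I = 3Pδ / (2b(a + |Δ|)). The sketch below evaluates it for invented load-point values; these are not the paper's data.

```python
# Modified beam theory data reduction for a DCB test (ASTM D5528):
#   G_I = 3 * P * delta / (2 * b * (a + |Delta|))
# P: load (N), delta: load-point opening (m), b: width (m),
# a: delamination length (m), Delta_abs: crack-length correction (m).
def g1_mbt(P, delta, b, a, Delta_abs):
    """Mode I strain energy release rate (J/m^2)."""
    return 3.0 * P * delta / (2.0 * b * (a + Delta_abs))

# Invented example point: 60 N load, 10 mm opening, 25 mm wide specimen,
# 50 mm delamination, 3 mm correction term.
G1 = g1_mbt(P=60.0, delta=0.010, b=0.025, a=0.050, Delta_abs=0.003)
```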

Keywords: AFP, automated fibre placement, delamination, fracture toughness, inter-weaving

Procedia PDF Downloads 181
14026 Regret-Regression for Multi-Armed Bandit Problem

Authors: Deyadeen Ali Alshibani

Abstract:

In the literature, the multi-armed bandit problem is treated as a statistical decision model of an agent trying to optimize its decisions while improving its information at the same time. There are several different algorithmic models and applications for this problem. In this paper, we evaluate regret-regression by comparing it with the Q-learning method. A simulation on the determination of an optimal treatment regime is presented in detail.
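As background for the comparison, any bandit algorithm is judged by its cumulative regret. The sketch below runs a simple epsilon-greedy agent on three Bernoulli arms and accumulates expected regret; it is a generic baseline, not the paper's regret-regression or Q-learning implementation, and the arm means are invented.

```python
import random

random.seed(42)
means = [0.3, 0.5, 0.7]           # true success probabilities per arm
counts = [0] * 3                  # pulls per arm
values = [0.0] * 3                # running average reward per arm
regret, eps, best = 0.0, 0.1, max(means)

for t in range(5000):
    if random.random() < eps:
        arm = random.randrange(3)                        # explore
    else:
        arm = max(range(3), key=lambda a: values[a])     # exploit
    reward = 1.0 if random.random() < means[arm] else 0.0
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
    regret += best - means[arm]   # expected regret incurred this step
```

A good algorithm keeps the growth of `regret` sublinear in the number of plays; comparing the regret curves of two methods is exactly the kind of evaluation the paper describes.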

Keywords: optimal, bandit problem, optimization, dynamic programming

Procedia PDF Downloads 450
14025 Optimum Dewatering Network Design Using Firefly Optimization Algorithm

Authors: S. M. Javad Davoodi, Mojtaba Shourian

Abstract:

A groundwater table close to the ground surface causes major problems in construction and mining operations. One method to control groundwater in such cases is pumping wells. These wells remove excess water from the project site and lower the water table to a desirable level. Although the efficiency of this method is acceptable, it is expensive to apply, meaning that even a small improvement in the design of the pumping wells can lead to substantial cost savings. To minimize the total cost of the pumping-well method, a simulation-optimization approach is applied. The proposed model integrates MODFLOW as the simulation model with the firefly algorithm as the optimizer. MODFLOW computes the drawdown due to pumping in an aquifer, and the firefly algorithm finds the optimal values of the design parameters, namely the number, pumping rates and layout of the wells. The developed Firefly-MODFLOW model is applied to minimize the cost of the dewatering project for the ancient mosque of Kerman city in Iran. Repeated runs of the Firefly-MODFLOW model indicate that drilling two wells with a total pumping rate of 5503 m3/day solves the minimization problem. Results show that implementing the proposed solution leads to at least 1.5 m of drawdown in the aquifer beneath the mosque region. Also, the subsidence due to groundwater depletion is less than 80 mm. Sensitivity analyses indicate that the desired groundwater depletion has an enormous impact on the total cost of the project. Moreover, in a hypothetical aquifer, decreasing the hydraulic conductivity decreases the total water extraction required for dewatering.
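The optimization side of the model can be illustrated with a bare-bones firefly algorithm minimizing a toy quadratic cost standing in for the dewatering cost; in the actual model each cost evaluation would be a MODFLOW run. The objective, bounds and algorithm parameters here are invented for illustration.

```python
import math
import random

random.seed(1)

def cost(x):
    """Toy stand-in for the dewatering cost; minimum at x = (3, 3)."""
    return (x[0] - 3.0) ** 2 + (x[1] - 3.0) ** 2

n, dim = 15, 2                     # fireflies, decision variables
beta0, gamma, alpha = 1.0, 0.01, 0.2
pop = [[random.uniform(0.0, 10.0) for _ in range(dim)] for _ in range(n)]

for _ in range(100):
    for i in range(n):
        for j in range(n):
            if cost(pop[j]) < cost(pop[i]):      # j is brighter (cheaper)
                r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                beta = beta0 * math.exp(-gamma * r2)   # attraction strength
                pop[i] = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                          for a, b in zip(pop[i], pop[j])]
    alpha *= 0.97                                # damp the random walk

best = min(pop, key=cost)
```

Each iteration moves dimmer fireflies toward brighter (lower-cost) ones with distance-decaying attraction plus a damped random walk, which should drive the swarm toward low-cost regions of the search space.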

Keywords: groundwater dewatering, pumping wells, simulation-optimization, MODFLOW, firefly algorithm

Procedia PDF Downloads 292
14024 Some Extreme Halophilic Microorganisms Produce Extracellular Proteases with Long Lasting Tolerance to Ethanol Exposition

Authors: Cynthia G. Esquerre, Amparo Iris Zavaleta

Abstract:

Extremophiles constitute a potentially valuable source of proteases for the development of biotechnological processes; however, the number of available studies is limited compared to those on their mesophilic counterparts. Therefore, in this study, Peruvian halophilic microorganisms were characterized in order to select proteolytic strains that produce active proteases under demanding conditions. Proteolysis was screened using the streak-plate method with gelatin or skim milk as substrates. The proteolytic microorganisms were then selected for phenotypic characterization and screened by a semi-quantitative proteolytic test using a modified agar-diffusion method. Finally, proteolysis was evaluated using extracts partially purified by ice-cold ethanol precipitation and dialysis. All analyses were carried out over a wide range of NaCl concentrations, pH values, temperatures and substrates. Of a total of 60 strains, 21 proteolytic strains were selected; of these, 19 were extreme halophiles and 2 were moderate halophiles. Most proteolytic strains showed differences in their biochemical patterns, particularly in sugar fermentation. A total of 14 microorganisms produced extracellular proteases; 13 were neutral, and one was alkaline, showing activity up to pH 9.0. Gelatin was the substrate the proteases hydrolyzed most specifically. In general, catalytic activity was efficient over a wide range of NaCl concentrations (1 to 4 M) and temperatures (37 to 55 °C), and after ethanol exposure at -20 °C for 24 hours. In conclusion, this study reports 14 extremely halophilic candidates producing extracellular proteases that are stable and active over a wide range of NaCl concentrations and temperatures, and even after long-lasting ethanol exposure.

Keywords: biotechnological processes, ethanol exposition, extracellular proteases, extremophiles

Procedia PDF Downloads 280
14023 Two Component Source Apportionment Based on Absorption and Size Distribution Measurement

Authors: Tibor Ajtai, Noémi Utry, Máté Pintér, Gábor Szabó, Zoltán Bozóki

Abstract:

Beyond its climate- and health-related issues, ambient light-absorbing carbonaceous particulate matter (LAC) has recently also attracted great scientific interest in terms of its regulation. Recent studies have experimentally demonstrated that LAC is dominantly composed of traffic and wood-burning aerosol, particularly under wintertime urban conditions, when photochemical and biological activities are negligible. Several methods have been introduced to quantitatively apportion the aerosol fractions emitted by wood burning and traffic, but most of them require costly and time-consuming off-line chemical analysis. As opposed to chemical features, the microphysical properties of airborne particles, such as optical absorption and size distribution, can easily be measured on-line with high accuracy and sensitivity, especially under highly polluted urban conditions. Recently, a new method has been proposed for the apportionment of wood-burning and traffic aerosols based on the spectral dependence of their absorption, quantified by the Aerosol Angström Exponent (AAE). In this approach the absorption coefficient is deduced from a transmission measurement on a filter-accumulated aerosol sample, and the conversion factor between the measured optical absorption and the corresponding mass concentration (the specific absorption cross section) is determined by on-site chemical analysis. The recently developed multi-wavelength photoacoustic instruments provide a novel, in-situ approach to the reliable and quantitative characterization of carbonaceous particulate matter. Therefore, they also open up novel possibilities for source apportionment through the measurement of light absorption.
In this study, we demonstrate an in-situ spectral characterization method for the ambient carbonaceous fraction based on light absorption and size distribution measurements using our state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS) and a Scanning Mobility Particle Sizer (SMPS). The carbonaceous-particulate-selective source apportionment study was performed on ambient particulate matter in the city center of Szeged, Hungary, where the dominance of traffic and wood-burning aerosol had been experimentally demonstrated earlier. The proposed model is based on the parallel, in-situ measurement of optical absorption and size distribution. AAEff and AAEwb were deduced from the measured data using the defined correlation between the AOC(1064nm)/AOC(266nm) and N100/N20 ratios. σff(λ) and σwb(λ) were determined with the help of the independently measured temporal mass concentrations in the PM1 mode. Furthermore, the proposed optical source apportionment is based on the assumption that the light-absorbing fraction of PM is exclusively related to traffic and wood burning. This assumption is indirectly confirmed here by the fact that the measured size distribution is composed of two unimodal size distributions identified as corresponding to traffic and wood-burning aerosols. The method offers the possibility of replacing laborious chemical analysis with the simple in-situ measurement of aerosol size distribution data. The results of the proposed novel optical-absorption-based source apportionment method prove its applicability whenever measurements are performed at an urban site where traffic and wood burning are the dominant sources of carbonaceous emission.
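The two-component idea can be illustrated numerically: assuming fixed Angström exponents for fossil-fuel and wood-burning aerosol (the values below are typical literature assumptions, not the study's fitted AAEff/AAEwb), the absorption measured at the two extreme PAS wavelengths splits into source contributions by solving a 2x2 linear system. The measured absorptions are invented example values.

```python
import numpy as np

# Model: b(lam) = b_ff * (lam/lam0)**-AAE_ff + b_wb * (lam/lam0)**-AAE_wb
lam1, lam2, lam0 = 266.0, 1064.0, 1064.0    # nm; lam0 is the reference
aae_ff, aae_wb = 1.0, 2.0                   # assumed source exponents

A = np.array([[(lam1 / lam0) ** -aae_ff, (lam1 / lam0) ** -aae_wb],
              [(lam2 / lam0) ** -aae_ff, (lam2 / lam0) ** -aae_wb]])
b_meas = np.array([45.0, 8.0])              # example absorption (Mm^-1)

b_ff, b_wb = np.linalg.solve(A, b_meas)     # source absorption at lam0
wb_share = b_wb / (b_ff + b_wb)             # wood-burning fraction
```

Because both basis functions equal 1 at the reference wavelength, the two retrieved contributions sum exactly to the absorption measured at 1064 nm.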

Keywords: absorption, size distribution, source apportionment, wood burning, traffic aerosol

Procedia PDF Downloads 223
14022 Most Recent Lifespan Estimate for the Itaipu Hydroelectric Power Plant Computed by Using Borland and Miller Method and Mass Balance in Brazil, Paraguay

Authors: Anderson Braga Mendes

Abstract:

The Itaipu Hydroelectric Power Plant is located on the Paraná River, which forms a natural boundary between Brazil and Paraguay; thus, the facility is shared by both countries. Itaipu is the biggest hydroelectric generator in the world and provides a clean and renewable electricity supply covering 17% and 76% of the consumption of Brazil and Paraguay, respectively. The plant started generating in 1984. It has 20 Francis turbines and an installed capacity of 14,000 MW. Its historic generation record occurred in 2016 (103,098,366 MWh), and from the beginning of its operation until the last day of 2016 the plant produced a total of 2,415,789,823 MWh. The distinct sedimentological aspects of the drainage area of the Itaipu Power Plant, from the stretch upstream (Porto Primavera and Rosana dams) to downstream (the Itaipu dam itself), were taken into account in order to best estimate the increase/decrease in sediment yield using data from 2001 to 2016. These data are collected through a network of 14 automatic sedimentometric stations managed by the company itself and operating on an hourly basis, covering an area of around 136,000 km² (92% of the incremental drainage area of the undertaking). Since 1972, a series of lifespan studies for the Itaipu Power Plant have been made, the first by Hans Albert Einstein at the time of the feasibility studies for the enterprise. From that date onwards, eight further studies were made over the following 44 years, aiming to bring more precision to the estimates based on more up-to-date data sets. The analysis of each monitoring station clearly revealed strongly increasing tendencies in sediment yield over the last 14 years, mainly in the Iguatemi, Ivaí, São Francisco Falso and Carapá Rivers, the last of which is situated in Paraguay, whereas the others lie entirely in Brazilian territory.
Five lifespan scenarios considering different sediment yield tendencies were simulated with the aid of the software packages SEDIMENT and DPOSIT, both developed by the author of the present work. These programs closely follow the Borland and Miller methodology (the empirical area-reduction method). The soundest of the five scenarios analyzed indicated a lifespan forecast of 168 years, with the reservoir only 1.8% silted by the end of 2016, after 32 years of operation. Besides, the mass balance of the reservoir (water inflows minus outflows) between 1986 and 2016 shows that 2% of the whole Itaipu lake is silted at present. Owing to the convergence of both results, which were obtained with different methodologies and independent input data, it can be concluded that the mathematical modeling is satisfactory and calibrated, lending credibility to this most recent lifespan estimate.
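As a crude cross-check of these figures, if siltation continued at the average rate observed so far and "lifespan" is taken as the time to fill an assumed critical fraction of the reservoir volume (the 10% used below is an assumption, not a figure from the study), a linear extrapolation lands in the same range as the 168-year forecast. The actual Borland and Miller area-reduction method distributes the sediment over elevation zones and is far more detailed than this sketch.

```python
# Linear extrapolation of reservoir siltation (back-of-the-envelope only).
silted_fraction = 0.018      # 1.8% of volume silted after 32 years (reported)
years_elapsed = 32.0
critical_fraction = 0.10     # assumed fraction at which storage is impaired

rate = silted_fraction / years_elapsed       # volume fraction silted per year
lifespan_years = critical_fraction / rate    # ~178 years at the current rate
```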

Keywords: Borland and Miller method, hydroelectricity, Itaipu Power Plant, lifespan, mass balance

Procedia PDF Downloads 272
14021 Co-pyrolysis of Sludge and Kaolin/Zeolite to Stabilize Heavy Metals

Authors: Qian Li, Zhaoping Zhong

Abstract:

Sewage sludge, a typical solid waste, is inevitably produced in enormous quantities in China. Worse still, the amount of sewage sludge produced has been increasing due to rapid economic development and urbanization. Compared to conventional methods of treating sewage sludge, pyrolysis is considered an economical and ecological technology because it can significantly reduce the sludge volume, completely kill pathogens, and produce valuable solid, gaseous, and liquid products. However, the large-scale utilization of sludge biochar has been limited by the considerable risk posed by heavy metals in the sludge. Heavy metals enriched in pyrolytic biochar can be divided into exchangeable, reducible, oxidizable, and residual forms. The residual form is the most stable and cannot be taken up by organisms. Kaolin and zeolite are environmentally friendly inorganic minerals with a high surface area and heat resistance, and they therefore exhibit enormous potential to immobilize heavy metals. To reduce the risk of heavy metals leaching from the pyrolysis biochar, this study pyrolyzed sewage sludge mixed with kaolin/zeolite in a small rotary kiln. The influences of the additives and the pyrolysis temperature on the leaching concentration and morphological transformation of heavy metals in the pyrolysis biochar were investigated. The potential mechanism by which co-pyrolysis of sludge blended with kaolin/zeolite stabilizes heavy metals was explored by scanning electron microscopy, X-ray diffraction, and specific surface area and porosity analysis. The European Community Bureau of Reference sequential extraction procedure was applied to analyze the forms of heavy metals in the sludge and the pyrolysis biochar. All heavy metal concentrations were determined by flame atomic absorption spectrophotometry. 
Compared with the proportions of heavy metals in the F4 (residual) fraction of pyrolytic carbon prepared without additives, those in carbon obtained by co-pyrolysis of sludge with kaolin/zeolite increased. Increasing the additive dosage improved the proportions of the stable fraction of the various heavy metals in the biochar. Kaolin showed a better heavy-metal-stabilizing effect than zeolite. Aluminosilicate additives with excellent adsorption performance capture more of the heavy metals released during sludge pyrolysis; the heavy metal ions then react with the oxygen ions of the additives to form silicates and aluminates, converting the heavy metals from unstable fractions (sulfates, chlorides, etc.) to stable fractions (silicates, aluminates, etc.). This study reveals that the efficiency of stabilizing heavy metals depends on the formation of stable mineral compounds containing heavy metals in the pyrolysis biochar.
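Results from sequential extractions of this kind are typically summarized as the share of each metal held in the stable residual (F4) fraction. The helper below computes that share for invented example concentrations of one metal with and without kaolin, mirroring the reported trend that the additive raises the stable proportion.

```python
# BCR sequential extraction yields four operational fractions per metal:
# F1 exchangeable, F2 reducible, F3 oxidizable, F4 residual (stable).
def residual_share(f1, f2, f3, f4):
    """Fraction of the metal held in the stable residual (F4) form."""
    return f4 / (f1 + f2 + f3 + f4)

# Invented concentrations (mg/kg) for one metal in two biochars.
no_additive = residual_share(f1=12.0, f2=8.0, f3=20.0, f4=60.0)
with_kaolin = residual_share(f1=6.0, f2=5.0, f3=14.0, f4=75.0)
improved = with_kaolin > no_additive   # additive raised the stable share
```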

Keywords: co-pyrolysis, heavy metals, immobilization mechanism, sewage sludge

Procedia PDF Downloads 63
14020 Reaching a Mobile and Dynamic Nose after Rhinoplasty: A Pilot Study

Authors: Guncel Ozturk

Abstract:

Background: Rhinoplasty is one of the most commonly performed cosmetic operations in plastic surgery. The maneuvers used in rhinoplasty lead to a firm and stiff nasal tip in the early postoperative months. This unnatural stability of the nose may easily cause distortion of the reshaped nose after severe trauma. Moreover, a firm nasal tip may cause difficulties in activities such as touching, hugging, or kissing. Decreasing the stability and increasing the mobility of the nasal tip would help rhinoplasty patients avoid these small but relatively important problems. Methods: We used a delivery approach with closed rhinoplasty and changed the positions of the intranasal incisions to achieve a dynamic and mobile nose. A total of 203 patients who had undergone primary closed rhinoplasty in private practice were reviewed retrospectively. A posterior strut flap was formed that remained connected to the connective tissues at the caudal end of the septum and to the medial crura. The cartilage of the posterior strut graft was left 2 mm thick in the distal part of the septum, it was cut vertically, and the connective tissue in the distal part was preserved. Results: The median patient age was 24 (range 17-42) years. The median follow-up period was 15.2 (range 12-26) months. Patient satisfaction was assessed with the 'Rhinoplasty Outcome Evaluation' (ROE) questionnaire. Twelve months after surgery, 87.5% of patients reported excellent outcomes according to the ROE. Conclusion: With this method, the soft tissue connections between that segment and the surrounding structures are preserved, maintaining tip support while providing a mobile tip at the same time. These modifications give access to a mobile, non-stiff, and dynamic nasal tip in the early postoperative months. Further prospective studies should be performed to support this method.

Keywords: closed rhinoplasty, dynamic, mobile, tip

Procedia PDF Downloads 129
14019 Candida antartica Lipase Assisted Enrichment of n-3 PUFA in Indian Sardine Oil

Authors: Prasanna Belur, P. R. Ashwini, Sampath Charanyaa, I. Regupathi

Abstract:

Indian oil sardine (Sardinella longiceps) is one of the richest and cheapest sources of n-3 polyunsaturated fatty acids (n-3 PUFA) such as eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA). The health benefits conferred by n-3 PUFA upon consumption, in the prevention and treatment of coronary, neuromuscular and immunological disorders and allergic conditions, are well documented. Natural refined Indian sardine oil generally contains about 25% (w/w) n-3 PUFA along with various unsaturated and saturated fatty acids in the form of mono-, di-, and triglycerides. A high concentration of n-3 PUFA in the glyceride form is most desirable for human consumption to obtain maximum health benefits. Thus, enhancing the n-3 PUFA content while retaining it in the glyceride form with green technology is the need of the hour. In this study, refined Indian sardine oil was subjected to selective hydrolysis by Candida antarctica lipase to enhance the n-3 PUFA content. The degree of hydrolysis and the enhancement of n-3 PUFA content were estimated by determining the acid value, iodine value, and EPA and DHA content (by gas chromatographic methods after derivatization) before and after hydrolysis. Various reaction parameters such as pH, temperature, enzyme load, lipid-to-aqueous-phase volume ratio and incubation time were optimized by conducting trials with a one-parameter-at-a-time approach. Incubating the enzyme solution with refined sardine oil at a volume ratio of 1:1, pH 7.0, and 50 °C for 60 minutes, with an enzyme load of 60 mg/ml, was found to be optimum. After enzymatic treatment, the oil was refined to remove free fatty acids and moisture using previously optimized refining technology. Enzymatic treatment at the optimal conditions resulted in a 12.11% enhancement in the degree of hydrolysis. The iodine number increased by 9.7%, and the n-3 PUFA content was enhanced by 112% (w/w). Selective enhancement of n-3 PUFA glycerides, eliminating saturated and unsaturated fatty acids from the oil using an enzyme, is an interesting proposition, as this technique is environment-friendly, cost-effective and provides a natural source of n-3 PUFA-rich oil.

Keywords: Candida antarctica, lipase, n-3 polyunsaturated fatty acids, sardine oil

Procedia PDF Downloads 221
14018 Crop Leaf Area Index (LAI) Inversion and Scale Effect Analysis from Unmanned Aerial Vehicle (UAV)-Based Hyperspectral Data

Authors: Xiaohua Zhu, Lingling Ma, Yongguang Zhao

Abstract:

Leaf Area Index (LAI) is a key structural characteristic of crops and plays a significant role in precision agricultural management and farmland ecosystem modeling. However, LAI retrieved from data of different resolutions contains scaling bias due to spatial heterogeneity and model non-linearity; that is, there is a scale effect in multi-scale LAI estimation. In this article, a typical farmland in the semi-arid regions of Chinese Inner Mongolia is taken as the study area. Based on the combination of the PROSPECT and SAIL models, a multi-dimensional look-up table (LUT) is generated for LAI estimation of multiple crops from unmanned aerial vehicle (UAV) hyperspectral data. Based on the Taylor expansion method and a computational geometry model, a scale transfer model considering both inter-class and intra-class differences is constructed for scale effect analysis of LAI inversion over inhomogeneous surfaces. The results indicate that (1) the LUT method based on classification and parameter sensitivity analysis is useful for LAI retrieval of corn, potato, sunflower and melon on the typical farmland, with a coefficient of determination R² of 0.82 and a root mean square error (RMSE) of 0.43 m²/m². (2) The scale effect of LAI becomes more pronounced as image resolution decreases, with a maximum scale bias of more than 45%. (3) The inter-class scale effect is larger than the intra-class one and can be corrected efficiently by the scale transfer model based on Taylor expansion and computational geometry. After correction, the maximum scale bias is reduced to 1.2%.
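As an illustration of the LUT inversion step described above (toy values, not the study's actual PROSPECT+SAIL simulations), the retrieval can be sketched as a nearest-spectrum search: the LAI of the simulated spectrum that best matches the observed reflectance is returned.

```python
import numpy as np

def retrieve_lai(observed, lut_spectra, lut_lai):
    """Nearest-neighbour LUT inversion: pick the LAI whose simulated
    spectrum minimises the RMSE against the observed reflectance."""
    cost = np.sqrt(np.mean((lut_spectra - observed) ** 2, axis=1))
    return lut_lai[np.argmin(cost)]

# Toy LUT: 3 candidate LAI values, 4 spectral bands each (hypothetical numbers).
lut_lai = np.array([0.5, 2.0, 4.0])
lut_spectra = np.array([
    [0.10, 0.30, 0.40, 0.35],
    [0.06, 0.25, 0.50, 0.45],
    [0.04, 0.20, 0.55, 0.50],
])
obs = np.array([0.05, 0.24, 0.51, 0.46])
print(retrieve_lai(obs, lut_spectra, lut_lai))  # closest simulated spectrum -> 2.0
```

In practice the LUT is built over several parameters (LAI, chlorophyll, soil background, and so on) and the cost may be a weighted or band-subset RMSE; the search itself stays the same.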

Keywords: leaf area index (LAI), scale effect, UAV-based hyperspectral data, look-up-table (LUT), remote sensing

Procedia PDF Downloads 439
14017 Deep Learning Based Polarimetric SAR Images Restoration

Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli

Abstract:

In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for Earth's surface monitoring. SAR systems often transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings the property of rotational invariance to geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area or swath of fully polarimetric images compared to the swath of dual or hybrid polarimetric images. Augmenting dual polarimetric data to fully polarimetric data would therefore allow full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images using hybrid polarimetric data can be found in the literature.
Although the improvements achieved by recently investigated reconstruction techniques are undeniable, existing methods are mostly based upon model assumptions (especially the assumption of reflection symmetry), which may limit their reliability and applicability in vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses deep learning to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem by focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images, defined by the combination of different terms in the cost or loss function. The proposed method is experimentally validated with real data sets and compared with a well-known standard approach from the literature. In the experiments, the reconstruction performance of the proposed framework is superior to that of conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.

Keywords: SAR image, deep learning, convolutional neural network, deep neural network, SAR polarimetry

Procedia PDF Downloads 84
14016 Spray Drying: An Innovative and Sustainable Method of Preserving Fruits

Authors: Adepoju Abiola Lydia, Adeyanju James Abiodun, Abioye A. O.

Abstract:

Spray drying, an innovative and sustainable preservation method, is increasingly gaining recognition for its potential to enhance food security by extending the shelf life of fruits. This technique involves the atomization of fruit pulp into fine droplets, followed by rapid drying with hot air, resulting in a powdered product that retains much of the original fruit's nutritional value, flavor, and color. By encapsulating sensitive bioactive compounds within a dry matrix, spray drying mitigates nutrient degradation and extends product usability. This technology aligns with sustainability goals by reducing post-harvest losses, minimizing the need for preservatives, and lowering energy consumption compared to conventional drying methods. Furthermore, spray drying enables the use of imperfect or surplus fruits, contributing to waste reduction and providing a continuous supply of nutritious fruit-based ingredients regardless of seasonal variations. The powdered form enhances versatility, allowing incorporation into various food products, thus broadening the scope of fruit utilization. Innovations in spray drying, such as the use of novel carrier agents and optimization of processing parameters, enhance the quality and functionality of the final product. Moreover, the scalability of spray drying makes it suitable for both industrial applications and smaller-scale operations, supporting local economies and food systems. In conclusion, spray drying stands out as a key technology in enhancing food security by ensuring a stable supply of high-quality, nutritious food ingredients while fostering sustainable agricultural practices.

Keywords: spray drying, sustainable, process parameters, carrier agents, fruits

Procedia PDF Downloads 9
14015 Event Data Representation Based on Time Stamp for Pedestrian Detection

Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita

Abstract:

In association with the wave of electric vehicles (EV), low energy consumption systems have become more and more important. One of the key technologies for realizing low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several notable features, such as high temporal resolution, which can achieve 1 Mframe/s, and a high dynamic range (120 dB). However, the property that contributes most to low energy consumption is its sparsity; to be more specific, this sensor only captures pixels that undergo an intensity change. In other words, there is no signal in areas without any intensity change, so the sensor is more energy efficient than conventional sensors such as RGB cameras because redundant data is removed. On the other hand, the data is difficult to handle because its format is completely different from an RGB image: the acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, a polarity (two values: +1 or -1) and a timestamp; it does not include intensity such as RGB values. Therefore, since existing algorithms cannot be used straightforwardly, a new processing algorithm has to be designed to cope with DVS data. To overcome the difficulties caused by the data format differences, most prior works construct frame data and feed it to deep learning models such as convolutional neural networks (CNN) for object detection and recognition. However, even with frame data, it is still difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as intensity instead of RGB pixel values, polarity information is clearly not rich enough. Considering this context, we propose to use the timestamp information as the data representation fed to deep learning.
Concretely, we first construct frame data divided by a certain time period, then assign an intensity value according to the timestamp within each frame; for example, a high value is given to a recent signal. We expect this data representation to capture features, especially of moving objects, because the timestamps encode movement direction and speed. Using this proposed method, we built our own dataset with a DVS fixed on a parked car to develop an application for a surveillance system that detects persons around the car. We consider the DVS one of the ideal sensors for surveillance purposes because it can run for a long time with low energy consumption in a static situation. For comparison, we reproduced a state-of-the-art method as a benchmark, which constructs frames in the same way but feeds polarity information to the CNN. We then measured the object detection performance of the benchmark and of our method on the same dataset. As a result, our method achieved an F1 score up to 7 points higher than the benchmark.
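The timestamp-based representation described above can be sketched as follows. This is a minimal illustration with hypothetical events and a simple linear recency scheme; the actual frame period and decay scheme used in the study may differ.

```python
import numpy as np

def timestamp_frame(events, shape, t_start, t_end):
    """Accumulate events into a single frame where pixel intensity encodes
    recency: newer events within the window get values closer to 1.0.
    `events` is an array of (x, y, polarity, timestamp) rows."""
    frame = np.zeros(shape, dtype=np.float32)
    for x, y, _, t in events:
        if t_start <= t < t_end:
            # Linear recency: the most recent possible event in the window -> ~1.0.
            frame[int(y), int(x)] = (t - t_start) / (t_end - t_start)
    return frame

# Hypothetical events: x, y, polarity, timestamp (microseconds).
events = np.array([
    [1, 0, +1, 100],
    [2, 1, -1, 900],
])
f = timestamp_frame(events, shape=(2, 3), t_start=0, t_end=1000)
print(f[0, 1], f[1, 2])  # 0.1 0.9: the later event is brighter
```

A moving object then leaves an intensity gradient across the frame, which is what lets the network infer direction and speed from a single channel.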

Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption

Procedia PDF Downloads 90
14014 Telemedicine in Physician Assistant Education: A Partnership with Community Agency

Authors: Martina I. Reinhold, Theresa Bacon-Baguley

Abstract:

A core challenge of physician assistant education is preparing professionals for lifelong learning. While this conventionally has encompassed scientific advances, students must also embrace new care delivery models and technologies. Telemedicine, the provision of care via two-way audio and video, is an example of a technological advance reforming health care. During a three-semester sequence of Hospital Community Experiences, physician assistant students were assigned experiences with Answer Health on Demand, a telemedicine collaborative. Preceding the experiences, the agency lectured on the application of telemedicine. Students were then introduced to the technology and partnered with a provider. Prior to observing the patient-provider interaction, patient consent was obtained. Afterwards, students completed a reflection paper on lessons learned and the potential impact of telemedicine on their careers. Thematic analysis was completed on the students’ reflection papers (n=13). Preceding the lecture and experience, over 75% of students (10/13) were unaware of telemedicine. Several stated they were 'skeptical' about the effectiveness of 'impersonal' health care appointments. After the experience, all students remarked that telemedicine will play a large role in the future of healthcare and will provide benefits by improving access in rural areas, decreasing wait time, and saving cost. More importantly, 30% of students (4/13) commented that telemedicine is a technology they can see themselves using in their future practice. Initial results indicate that collaborative interaction between students and telemedicine providers enhanced student learning and exposed students to technological advances in the delivery of care. Further, results indicate that students perceived telemedicine more favorably as a viable delivery method after the experience.

Keywords: collaboration, physician assistant education, teaching innovative health care delivery method, telemedicine

Procedia PDF Downloads 193
14013 Self-Supervised Attributed Graph Clustering with Dual Contrastive Loss Constraints

Authors: Lijuan Zhou, Mengqi Wu, Changyong Niu

Abstract:

Attributed graph clustering can utilize the graph topology and node attributes to uncover hidden community structures and patterns in complex networks, aiding in the understanding and analysis of complex systems. Utilizing contrastive learning for attributed graph clustering can effectively exploit meaningful implicit relationships between data. However, existing attributed graph clustering methods based on contrastive learning suffer from the following drawbacks: 1) Complex data augmentation increases computational cost, and inappropriate data augmentation may lead to semantic drift. 2) The selection of positive and negative samples neglects the intrinsic cluster structure learned from graph topology and node attributes. Therefore, this paper proposes a method called self-supervised Attributed Graph Clustering with Dual Contrastive Loss constraints (AGC-DCL). Firstly, Siamese Multilayer Perceptron (MLP) encoders are employed to generate two views separately to avoid complex data augmentation. Secondly, the neighborhood contrastive loss is introduced to constrain node representation using local topological structure while effectively embedding attribute information through attribute reconstruction. Additionally, clustering-oriented contrastive loss is applied to fully utilize clustering information in global semantics for discriminative node representations, regarding the cluster centers from two views as negative samples to fully leverage effective clustering information from different views. Comparative clustering results with existing attributed graph clustering algorithms on six datasets demonstrate the superiority of the proposed method.
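As a rough illustration of the contrastive principle described above (not the AGC-DCL loss itself, whose neighborhood and clustering-oriented terms are specific to the paper), a minimal NT-Xent-style loss between two views can be sketched: matching rows of the two embeddings act as positives, all other rows as negatives.

```python
import numpy as np

def ntxent_loss(z1, z2, tau=0.5):
    """A minimal NT-Xent-style contrastive loss between two views:
    each row of z1 is pulled toward the matching row of z2 and pushed
    away from all other rows (the negatives)."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                  # pairwise cosine similarities
    sim -= sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))     # positives lie on the diagonal

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 4))
# Aligned views (matching rows) score a lower loss than misaligned ones.
print(ntxent_loss(z, z) < ntxent_loss(z, z[::-1]))  # True
```

In the paper's setting the two views come from Siamese MLP encoders rather than data augmentation, and the negative set is chosen using the learned cluster structure rather than simply "all other rows".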

Keywords: attributed graph clustering, contrastive learning, clustering-oriented, self-supervised learning

Procedia PDF Downloads 44
14012 Effects of Ultraviolet Treatment on Microbiological Load and Phenolic Content of Vegetable Juice

Authors: Kubra Dogan, Fatih Tornuk

Abstract:

Due to increasing consumer demand for high-quality food products and awareness regarding the health benefits of different nutrients in food, minimal processing has become more popular in modern food preservation. To date, heat treatment is often used to inactivate spoilage microorganisms in foods. However, it may cause significant changes in the quality and nutritional properties of food. To overcome the detrimental effects of heat treatment, several non-thermal microbial inactivation processes have been investigated as alternatives. Ultraviolet (UV) inactivation is a promising and feasible alternative to heat treatment for better quality and longer shelf life, which aims to inhibit spoilage and pathogenic microorganisms and to inactivate enzymes in vegetable juice production. UV-C is a sub-class of UV treatment that shows the highest microbicidal effect between 250 and 270 nm. The wavelength of 254 nm is used for the surface disinfection of certain liquid food products such as vegetable juice. The effects of UV-C treatment on the microbiological load and quality parameters of a vegetable juice mix of celery, carrot, lemon and orange were investigated. Our results showed that storing UV-C-treated vegetable juice for three months reduced the TMAB count by 3.5 log cfu/g and the yeast-mold count by 2 log cfu/g compared to the control sample. The total phenolic content was found to be 514.3 ± 0.6 mg gallic acid equivalent/L, with no significant difference compared to the control. The present work suggests that UV-C treatment is an alternative method for the disinfection of vegetable juice, since it enables adequate microbial inactivation and longer shelf life with minimal degradation of the juice's quality parameters.
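The reported log reductions can be read with a one-line calculation (illustrative counts, not the study's raw data): a 3.5-log reduction means the viable count fell by a factor of 10^3.5, roughly 3162-fold.

```python
import math

def log_reduction(n_before, n_after):
    """Log10 reduction in viable counts (e.g. cfu/g) after a treatment."""
    return math.log10(n_before / n_after)

# Hypothetical counts illustrating the reported 3.5-log TMAB reduction:
print(round(log_reduction(1e6, 1e6 / 10**3.5), 1))  # 3.5
```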

Keywords: heat treatment, phenolic content, shelf life, ultraviolet (UV-C), vegetable juice

Procedia PDF Downloads 205
14011 The Design Method of Artificial Intelligence Learning Picture: A Case Study of DCAI's New Teaching

Authors: Weichen Chang

Abstract:

To create a guided teaching method for AI generative drawing design, this paper develops a set of teaching models for AI generative drawing (DCAI) that combines learning modes such as problem-solving, thematic inquiry, phenomenon-based learning, task-oriented learning, and DFC. Through guided programs and content based on an information-security AI picture book, participatory action research (PAR) and interviews were applied to explore how the dual knowledge of Context and ChatGPT (DCAI) guides the development of students' AI learning skills. In the interviews, the students highlighted five main learning outcomes (self-study, critical thinking, knowledge generation, cognitive development, and presentation of work) as well as the challenges of implementing the model. Through the use of DCAI, students enhance their shared awareness of generative drawing analysis and group cooperation, and gain knowledge that can enhance AI capabilities in DCAI inquiry and in future life. This paper concludes that (1) good use of DCAI can assist students in exploring the value of their knowledge through the power of stories and in finding the meaning of knowledge communication; (2) analyzing a story's integrity and coherence through its context achieves the tension of 'beginning and ending'; (3) ChatGPT can be used to extract inspiration, arrange story compositions, and craft prompts that communicate with people and convey emotion. Facing artificial intelligence, new knowledge construction methods will therefore be among the effective approaches to AI learning, providing new thinking and new expression for interdisciplinary design and design education practice.

Keywords: artificial intelligence, task-oriented, contextualization, design education

Procedia PDF Downloads 26
14010 Sequential and Combinatorial Pre-Treatment Strategy of Lignocellulose for the Enhanced Enzymatic Hydrolysis of Spent Coffee Waste

Authors: Rajeev Ravindran, Amit K. Jaiswal

Abstract:

Waste from the food-processing industry is produced in large amounts and contains high levels of lignocellulose. Its continuous accumulation throughout the year in large quantities creates a major environmental problem worldwide. The chemical composition of these wastes (up to 75% polysaccharide) makes them an inexpensive raw material for the production of value-added products such as biofuels, bio-solvents, nanocrystalline cellulose and enzymes. In order to use lignocellulose as a raw material for microbial fermentation, the substrate is subjected to enzymatic treatment, which releases reducing sugars such as glucose and xylose. However, inherent properties of lignocellulose, such as the presence of lignin, pectin, acetyl groups and crystalline cellulose, contribute to its recalcitrance and lead to poor sugar yields upon enzymatic hydrolysis. A pre-treatment is therefore generally applied before enzymatic treatment to remove the recalcitrant components of biomass through structural breakdown. The present study was carried out to find the best pre-treatment method for maximum liberation of reducing sugars from spent coffee waste (SPW). SPW was subjected to a range of physical, chemical and physico-chemical pre-treatments, followed by a sequential, combinatorial pre-treatment strategy that combines two or more pre-treatments to attain maximum sugar yield. All pre-treated samples were analysed for total reducing sugar, followed by identification and quantification of individual sugars by HPLC coupled with an RI detector. In addition, the generation of inhibitory compounds such as furfural and hydroxymethylfurfural (HMF), which can hinder microbial growth and enzyme activity, was monitored. Results showed that ultrasound treatment (31.06 mg/L) was the best pre-treatment method in terms of total reducing sugar content, followed by dilute acid hydrolysis (10.03 mg/L), while galactose was found to be the major monosaccharide in the pre-treated SPW. Finally, the results were used to design a sequential lignocellulose pre-treatment protocol that decreases the formation of enzyme inhibitors and increases the sugar yield on enzymatic hydrolysis with a cellulase-hemicellulase consortium. The sequential, combinatorial treatment performed better in terms of total reducing sugar yield and formed fewer inhibitory compounds, which could be because this mode of pre-treatment combines several mild treatments rather than a single harsh one. It also eliminates the need for a detoxification step and has potential application in the valorisation of lignocellulosic food waste.

Keywords: lignocellulose, enzymatic hydrolysis, pre-treatment, ultrasound

Procedia PDF Downloads 361
14009 Bulk Transport in Strongly Correlated Topological Insulator Samarium Hexaboride Using Hall Effect and Inverted Resistance Methods

Authors: Alexa Rakoski, Yun Suk Eo, Cagliyan Kurdak, Priscila F. S. Rosa, Zachary Fisk, Monica Ciomaga Hatnean, Geetha Balakrishnan, Boyoun Kang, Myungsuk Song, Byungki Cho

Abstract:

Samarium hexaboride (SmB6) is a strongly correlated mixed valence material and Kondo insulator. In the resistance-temperature curve, SmB6 exhibits activated behavior from 4-40 K after the Kondo gap forms. Below 4 K, however, the resistivity is temperature independent or only weakly temperature dependent due to the appearance of a topologically protected surface state. Current research suggests that the surface of SmB6 is conductive while the bulk is truly insulating, unlike conventional 3D topological insulators (TIs) such as Bi₂Se₃, which are plagued by bulk conduction due to impurities. To better understand why the bulk of SmB6 is so different from conventional TIs, this study employed a new method, called inverted resistance, to explore the lowest temperatures, as well as standard Hall measurements for the rest of the temperature range. In the inverted resistance method, current flows from an inner contact to an outer ring, and voltage is measured outside of this outer ring. This geometry confines the surface current and allows measurement of the bulk resistivity even when the conductive surface dominates transport (below 4 K). The results confirm that the bulk of SmB6 is truly insulating down to 2 K. Hall measurements on a number of samples show consistent bulk behavior from 4-40 K, but widely varying behavior among samples above 40 K. This is attributed to a combination of the growth process and the purity of the starting material, and the relationship between the high- and low-temperature behaviors is still being explored.

Keywords: bulk transport, Hall effect, inverted resistance, Kondo insulator, samarium hexaboride, topological insulator

Procedia PDF Downloads 157
14008 Analysis of Factors Affecting the Number of Infant and Maternal Mortality in East Java with Geographically Weighted Bivariate Generalized Poisson Regression Method

Authors: Luh Eka Suryani, Purhadi

Abstract:

Poisson regression is a non-linear regression model for count responses that follow a Poisson distribution. A pair of count variables showing high correlation can be analyzed by bivariate Poisson regression; the numbers of infant deaths and maternal deaths are such a pair. Poisson regression assumes equidispersion, i.e., equal mean and variance. In practice, however, the variance of count data can be greater than the mean (overdispersion) or smaller (underdispersion). Violations of this assumption can be handled by applying generalized Poisson regression. In addition, the characteristics of each regency can affect the number of cases that occur; this spatial variation is addressed by geographically weighted regression. This study analyzes the numbers of infant and maternal deaths based on conditions in East Java in 2016 using the Geographically Weighted Bivariate Generalized Poisson Regression (GWBGPR) method. Modeling is done with adaptive bisquare kernel weighting, which produces 3 regency groups based on the infant mortality rate and 5 regency groups based on the maternal mortality rate. Variables that significantly influence the numbers of infant and maternal deaths are the percentages of pregnant women who visit health workers at least 4 times during pregnancy, pregnant women who receive Fe3 tablets, obstetric complications handled, households with clean and healthy behavior, and married women whose first marriage occurred under the age of 18.
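Generalized Poisson and geographically weighted extensions aside, the basic building block here is a Poisson GLM with a log link. A minimal, illustrative fit by iteratively reweighted least squares on synthetic data (not the East Java dataset) can be sketched as:

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Fit a Poisson GLM (log link) by iteratively reweighted least squares.
    Returns the coefficient vector beta with E[y] = exp(X @ beta)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu          # working response
        W = mu                                 # IRLS weights for Poisson
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Hypothetical example: counts generated with true beta = (0.5, 0.8).
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
y = rng.poisson(np.exp(X @ np.array([0.5, 0.8])))
beta_hat = poisson_irls(X, y)
print(np.round(beta_hat, 2))  # estimates close to the true (0.5, 0.8)
```

The generalized Poisson model adds a dispersion parameter to relax the equidispersion assumption, and the geographically weighted variant refits this regression at each location with kernel weights on nearby observations.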

Keywords: adaptive bisquare kernel, GWBGPR, infant mortality, maternal mortality, overdispersion

Procedia PDF Downloads 156
14007 Optimal Operation of Bakhtiari and Roudbar Dam Using Differential Evolution Algorithms

Authors: Ramin Mansouri

Abstract:

Because river discharge regimes often conflict with water demands, one of the best ways to use water resources is to regulate the natural flow of rivers by constructing dams. In optimal reservoir operation, considering multiple important goals simultaneously is of high importance. To analyze this, 46 years of statistical data (1955-2001) for the Bakhtiari and Roudbar dams were used. First, an appropriate objective function was specified, and the rule curve was developed using the differential evolution (DE) algorithm. The rule-curve operation policy was then compared with the standard operation policy. The proposed method distributed the deficit over the whole year, inflicting the lowest damage on the system. The standard deviation of the monthly shortfall in each year was lower with the proposed algorithm than with the other two methods. The results show that median values of the F and Cr coefficients provide the optimum and keep the DE algorithm from being trapped in a local optimum; the optimal values were F = 0.6 and Cr = 0.5. After finding the best combination of F and Cr, the effect of population size was examined: populations of 4, 25, 50, 100, 500 and 1000 members were studied over two generation counts (G = 50 and 100). The results indicate that 200 generations are suitable for optimization. Runtime increases almost linearly with population size, which shows the effect of population on the algorithm's runtime; hence, specifying a suitable population to obtain optimal results is important. The standard operation policy had a better reliability percentage but inflicted severe vulnerability on the system. In low-rainfall years, the proposed method gave very good results compared with the other methods.
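The DE variant and objective used in the study are not given in full; a minimal DE/rand/1/bin sketch with the reported F = 0.6 and Cr = 0.5, applied to a toy objective standing in for the reservoir deficit function, might look like:

```python
import numpy as np

def differential_evolution(f, bounds, F=0.6, Cr=0.5, pop_size=25, gens=200, seed=0):
    """Minimal DE/rand/1/bin minimiser. `bounds` is an array of
    (low, high) pairs, one per decision variable."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(bounds)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals other than i.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover with at least one gene from the mutant.
            cross = rng.random(dim) < Cr
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, f_trial
    return pop[fit.argmin()], fit.min()

# Toy quadratic objective standing in for the reservoir deficit (hypothetical):
x_best, f_best = differential_evolution(lambda x: np.sum((x - 3.0) ** 2),
                                        np.array([[-10.0, 10.0]] * 2))
print(np.round(x_best, 2))  # converges near [3. 3.]
```

In the actual application, the decision vector would encode the rule curve (e.g. monthly release targets) and the objective would penalize supply deficits over the simulated record.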

Keywords: reservoirs, differential evolution, dam, optimal operation

Procedia PDF Downloads 73
14006 Impact of Ethnoscience-Based Teaching Approach: Thinking Relevance, Effectiveness and Learner Retention in Physics Concepts of Optics

Authors: Rose C.Anamezie, Mishack T. Gumbo

Abstract:

Physics learners’ poor retention, which culminates in poor achievement due to teaching approaches unrelated to learners’ cultures in non-Western contexts, warranted the study. The aim of this study was to determine the effectiveness of the ethnoscience-based teaching (EBT) approach on learners’ retention in the Physics concept of Optics in the Awka Education zone of Anambra State, Nigeria. Two research questions and three null hypotheses tested at a 0.05 level of significance guided the study. A quasi-experimental design, specifically a non-equivalent control group design, was adopted. The population for the study was 4,825 SS2 Physics learners in the zone, from which 160 SS2 learners were sampled using purposive and random sampling. The experimental group was taught rectilinear propagation of light (RPL) using the EBT approach, while the control group was taught the same topic using the lecture method. The instrument for data collection was the 50-item Physics Retention Test (PRT), which was validated by three experts and tested for reliability using the Kuder-Richardson formula 20, yielding a coefficient of 0.81. The data were analysed using mean, standard deviation and analysis of covariance (p < .05). The results showed higher retention with the EBT approach than with the lecture method, with no significant gender effect on learners’ retention in Physics. It was recommended that the EBT approach, which bridged the gender gap in Physics retention, be adopted in secondary school teaching and learning, since it could transform science teaching, enhance learners’ construction of new science concepts based on their existing knowledge, and bridge the gap between Western science and learners’ worldviews.

Keywords: ethnoscience-based teaching, optics, rectilinear propagation of light, retention

Procedia PDF Downloads 78
14005 Sea Protection: Using Marine Algae as a Natural Method of Absorbing Dye Textile Waste

Authors: Ariana Kilic, Serena Arapyan

Abstract:

Water pollution is a serious concern in seas around the world, and one major cause is textile dye waste mixing with seawater. This alters aquatic life, endangering organisms and deteriorating the water's quality. There is a significant need for a natural approach that reduces the amount of textile dye waste in seawater and ensures the safety of marine organisms. Marine algae are a viable solution, since they can eliminate the excess waste by absorbing the dye. Moreover, marine algae are non-vascular organisms that absorb water and nutrients directly, so using them as absorbers is a natural process that adds no inorganic matter to the seawater that could cause further pollution. To test the efficiency of this approach, the optical absorbance of seawater samples was measured before and after the addition of marine algae using colorimetry. A colorimeter finds the concentration of a chemical compound in a solution by measuring the compound's absorbance at a specific wavelength. Seawater samples containing equal amounts of water, to which textile dye was added, served as the constant variables. The initial and final absorbances of the water (the dependent variable) were measured before and after the addition of marine algae (the independent variable). A lower absorbance indicated a lower dye concentration, showing that the marine algae had done their job by absorbing the dye. The experiment was repeated with the same amount of water but different dye concentrations in order to determine the maximum concentration of dye the marine algae can completely absorb. The diminished dye concentration demonstrated that pollution caused by factory dye waste could be prevented with this natural method.
The use of marine algae is thus an optimal strategy: an organic solution for absorbing dye waste in seas and curbing water pollution.
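Colorimetric measurements of this kind relate absorbance to concentration through the Beer-Lambert law, A = ε·l·c. A minimal sketch of how a dye-removal fraction could be derived from before/after readings; the absorbance values and molar absorptivity below are illustrative, not the study's measurements:

```python
def concentration(absorbance, epsilon, path_length_cm=1.0):
    """Infer dye concentration (mol/L) from absorbance via Beer-Lambert: A = eps * l * c."""
    return absorbance / (epsilon * path_length_cm)

# Hypothetical colorimeter readings before and after adding marine algae
a_before, a_after = 0.80, 0.20
eps = 20000.0  # molar absorptivity in L/(mol*cm), illustrative value for a textile dye

c_before = concentration(a_before, eps)
c_after = concentration(a_after, eps)
removal = 1 - c_after / c_before  # fraction of dye removed (0.75 here)
```

Because ε and the path length cancel in the ratio, the removal fraction can be estimated directly from the two absorbance readings, which is why comparing initial and final absorbance suffices in this design.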

Keywords: water pollution, dye textile waste, marine algae, absorbance, colorimetry

Procedia PDF Downloads 15
14004 Impact of Climate Change on Some Physiological Parameters of Cyclic Female Egyptian Buffalo

Authors: Nabil Abu-Heakal, Ismail Abo-Ghanema, Basma Hamed Merghani

Abstract:

The aim of this investigation was to study the effect of seasonal variations in Egypt on the hematological parameters and the reproductive and metabolic hormones of Egyptian buffalo-cows. The study lasted one year, from December 2009 to November 2010, and was conducted on sixty buffalo-cows. Each month, a group of 5 buffalo-cows at the estrus phase was selected. Blood samples were taken by tail vein puncture on the 2nd day after natural service and divided into two portions: one with anticoagulant for hematological analysis and the other without anticoagulant for serum separation. The results revealed that the highest atmospheric temperature occurred in summer (32.61±1.12°C, versus 26.18±1.67°C in spring and 19.92±0.70°C in winter), while the highest relative humidity occurred in winter (43.50±1.60%, versus 32.50±2.29% in summer). The rise in the temperature-humidity index (THI) from 63.73±1.29 in winter to 78.53±1.58 in summer indicates severe heat stress, which was associated with significant reductions in total red blood cell count (3.20±0.15×10⁶), hemoglobin concentration (8.83±0.43 g/dl), packed cell volume (30.73±0.12%), lymphocyte percentage (40.66±2.33%), serum progesterone concentration (0.56±0.03 ng/ml), estradiol-17β concentration (16.8±0.64 ng/ml), triiodothyronine (T3) concentration (2.33±0.33 ng/ml) and thyroxine (T4) concentration (21.66±1.66 ng/ml). Summer also brought significant increases in mean cell volume (96.55±2.25 fl), mean cell hemoglobin (30.81±1.33 pg), total white blood cell count (10.63±0.97×10³), neutrophil percentage (49.66±2.33%), serum prolactin (PRL) concentration (23.45±1.72 ng/ml) and cortisol concentration (4.47±0.33 ng/ml) compared to winter. There was no significant seasonal variation in mean cell hemoglobin concentration (MCHC).
It was concluded that Egypt shows seasonal variation in atmospheric temperature, relative humidity and THI, and that the rise of THI above the upper critical level for lactating buffalo-cows (72 units) is the major constraint on buffalo-cows' hematological parameters and hormonal secretion, affecting animal reproduction. Hence, climatic conditions inside the dairy farm should be improved to eliminate or reduce summer infertility.
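The abstract does not state which THI formula the authors used; one widely used livestock formulation, THI = Tf − (0.55 − 0.0055·RH)·(Tf − 58) with Tf in °F, reproduces the reported summer value closely. A minimal sketch under that assumption:

```python
def thi(temp_c, rh_percent):
    """Temperature-humidity index, one common livestock formulation:
    THI = Tf - (0.55 - 0.0055 * RH) * (Tf - 58), where Tf is temperature in Fahrenheit.
    """
    tf = 1.8 * temp_c + 32  # convert Celsius to Fahrenheit
    return tf - (0.55 - 0.0055 * rh_percent) * (tf - 58)

# Seasonal means reported in the abstract
summer = thi(32.61, 32.5)  # approximately 78.6, close to the reported 78.53
winter = thi(19.92, 43.5)  # approximately 64.8, close to the reported 63.73
```

The computed summer value exceeds the 72-unit critical threshold cited in the conclusion, consistent with the severe heat stress the authors describe.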

Keywords: buffalo, climate change, Egypt, physiological parameters

Procedia PDF Downloads 649
14003 Investigation of Electrospun Composites Nanofiber of Poly (Lactic Acid)/Hazelnut Shell Powder/Zinc Oxide

Authors: Ibrahim Sengor, Sumeyye Cesur, Ilyas Kartal, Faik Nuzhet Oktar, Nazmi Ekren, Ahmet Talat Inan, Oguzhan Gunduz

Abstract:

In recent years, many researchers have focused on nano-scale fiber production. Nanofibers have been studied for their distinctive and superior physical, chemical and mechanical properties. Poly(lactic acid) (PLA) is a biodegradable thermoplastic polyester derived from renewable sources and used in biomedical applications owing to its biocompatibility and biodegradability. In addition, zinc oxide (ZnO) is an antibacterial material, and hazelnut shell powder serves as a filler. In this study, nanofibers were obtained by adding ZnO and hazelnut shell powder at different ratios and concentrations to PLA using electrospinning, the most common method for producing nanofibers. Granulated PLA was dissolved at concentrations of 1%, 2%, 3% and 4% in chloroform, homogenized with Tween and hazelnut shell powder at different ratios, and then electrospun into nanofibers. Scanning electron microscopy (SEM), Fourier transform infrared spectroscopy (FTIR) and differential scanning calorimetry (DSC) analyses, physical measurements such as density, electrical conductivity, surface tension and viscosity, and antimicrobial tests were carried out after production. The resulting nanofiber structures possess antimicrobial, antiseptic, non-toxic, self-cleaning and rigid properties, which are attractive for biomedical applications.

Keywords: electrospinning, hazelnut shell powder, nanofibers, poly (lactic acid), zinc oxide

Procedia PDF Downloads 158