Search results for: orthogonal design method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28116

18846 The Trajectory of the Ball in Football Game

Authors: Mahdi Motahari, Mojtaba Farzaneh, Ebrahim Sepidbar

Abstract:

Tracking moving and flying targets is one of the most important problems in image processing, and estimating the trajectory of a desired object over short- and long-term horizons is even more important than the tracking itself. In this paper, a new way of identifying and estimating the future long-term trajectory of a moving ball is presented through the synthesis and interaction of several algorithms: image processing algorithms including noise removal and image segmentation, a Kalman filter for short-term trajectory estimation of the ball in a football game, and an intelligent adaptive neuro-fuzzy algorithm based on the time series of traversed distance. Using these methods together with a video database, the proposed system attains more than 96% identification accuracy. Although the present method has high precision, it is time-consuming. Comparison with other methods confirms its accuracy and efficiency.
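The short-term estimation step described above can be sketched as a small constant-velocity Kalman filter over 2-D pixel positions (a minimal illustration; the state model, time step, and noise levels are assumptions, not the paper's values):

```python
import numpy as np

def kalman_track(measurements, dt=1.0, q=1e-3, r=1.0):
    """Constant-velocity Kalman filter over 2-D pixel positions.
    State is [px, py, vx, vy]; returns the filtered positions."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)   # constant-velocity motion model
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)   # only position is observed
    Q, R = q * np.eye(4), r * np.eye(2)   # assumed process/measurement noise
    x = np.array([*measurements[0], 0.0, 0.0])
    P = np.eye(4)
    track = []
    for z in measurements:
        x, P = F @ x, F @ P @ F.T + Q                 # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
        x = x + K @ (np.asarray(z, float) - H @ x)    # update with measurement
        P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track)
```

Applying `F` repeatedly to the final state extrapolates the ball's short-term future position, which is the role the filter plays ahead of the long-term neuro-fuzzy stage.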

Keywords: tracking, signal processing, moving and flying targets, artificial intelligence systems, trajectory estimation, Kalman filter

Procedia PDF Downloads 447
18845 Effect of Precursor Aging Time on the Photocatalytic Activity of ZnO Thin Films

Authors: N. Kaneva, A. Bojinova, K. Papazova

Abstract:

Thin ZnO films are deposited on glass substrates via the sol-gel method and dip-coating. The films are prepared from zinc acetate dihydrate as the starting reagent. The as-prepared ZnO sol is then aged for different periods (0, 1, 3, 5, 10, 15, and 30 days), and nanocrystalline thin films are deposited from the various sols. The effect of the ZnO sol aging time on the structural and photocatalytic properties of the films is studied. The film surface is examined by scanning electron microscopy, and the effect of the aging time of the starting solution on the photocatalytic degradation of Reactive Black 5 (RB5) is followed by UV-vis spectroscopy. The experiments are conducted under UV-light illumination and in complete darkness. The variation of the absorption spectra shows the degradation of RB5 dissolved in water as a result of the reaction occurring on the surface of the films, promoted by UV irradiation. The initial dye concentration (5, 10, and 20 ppm) and the aging time are varied during the experiments. The results show that increasing the aging time of the starting ZnO solution generally promotes photocatalytic activity: thin films obtained from a sol aged 30 days achieve the best photocatalytic degradation of the dye (97.22%) compared with freshly prepared ones (65.92%). The samples and photocatalytic experimental results are reproducible. All films exhibit substantial activity both under UV light and in darkness, which is promising for the development of new ZnO photocatalysts by the sol-gel method.
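Photocatalytic dye degradation of this kind is commonly summarized with pseudo-first-order kinetics, C(t) = C0·exp(-kt). A minimal sketch of estimating the apparent rate constant from concentration-time data (synthetic values, not the paper's measurements):

```python
import numpy as np

def pseudo_first_order_k(t, c):
    """Apparent rate constant k from ln(C0/C) = k*t,
    fitted by least squares through the origin."""
    t = np.asarray(t, float)
    c = np.asarray(c, float)
    y = np.log(c[0] / c)                      # linearized decay
    return float(np.dot(t, y) / np.dot(t, t))  # slope through the origin
```

Comparing k for films from sols of different aging times would quantify the activity trend reported above.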

Keywords: ZnO thin films, sol-gel, photocatalysis, aging time

Procedia PDF Downloads 366
18844 Stress Analysis of Tubular Bonded Joints under Torsion and Hygrothermal Effects Using DQM

Authors: Mansour Mohieddin Ghomshei, Reza Shahi

Abstract:

Laminated composite tubes with adhesively bonded joints are widely used in the aerospace and automotive industries as well as the oil and gas industry. In this research, adhesively bonded tubular single-lap joints subjected to torsional and hygrothermal loadings are studied using the differential quadrature method (DQM). The analysis is based on classical shell theory. First, an approximate closed-form solution is developed by omitting the lateral deflections in the connecting tubes. Using this analytical model, the circumferential displacements in the tubes and the shear stresses in the interfacing adhesive layer are determined. Then, a numerical formulation is presented using DQM in which the lateral deflections are taken into account. With the DQM formulation, the circumferential and radial displacements in the tubes as well as the shear and peel stresses in the adhesive layer are calculated. Results obtained from the proposed DQM solutions compare well with those of the approximate analytical model and with published references. Finally, using the DQM model, parametric studies are carried out to investigate the influence of parameters such as adhesive layer thickness, torsional loading, overlap length, tube radii, relative humidity, and temperature.
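The core of DQM is a weighting matrix that expresses the derivative at each grid point as a weighted sum of function values at all points. A minimal sketch of the standard first-derivative weighting coefficients in their Lagrange-polynomial form (illustrative only, not the paper's specific formulation):

```python
import numpy as np

def dqm_first_derivative(x):
    """First-order DQM weighting matrix A, so that f'(x_i) ~ sum_j A[i, j] * f(x_j).
    Off-diagonal terms follow the Lagrange-polynomial formula; diagonal terms
    make each row sum to zero (constants are differentiated exactly)."""
    x = np.asarray(x, float)
    n = len(x)
    # M[i] = prod_{k != i} (x_i - x_k)
    M = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i])
                  for i in range(n)])
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i, j] = M[i] / ((x[i] - x[j]) * M[j])
        A[i, i] = -A[i].sum()
    return A
```

Applying such matrices to the governing shell equations converts them into an algebraic system at the grid points, which is how DQM handles the joint's displacement fields.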

Keywords: adhesively bonded joint, differential quadrature method (DQM), hygrothermal, laminated composite tube

Procedia PDF Downloads 289
18843 Field Performance of Cement Treated Bases as a Reflective Crack Mitigation Technique for Flexible Pavements

Authors: Mohammad R. Bhuyan, Mohammad J. Khattak

Abstract:

Deterioration of flexible pavements due to crack reflection from the soil-cement base layer is a major concern around the globe. The service life of a flexible pavement diminishes significantly because of reflective cracks. Highway agencies have struggled for decades to prevent or mitigate these cracks in order to increase pavement service lives. The root cause of reflective cracking is the shrinkage cracking that occurs in soil-cement bases during the cement hydration process. The primary factor driving shrinkage is the cement content of the soil-cement mixture. With increasing cement content, the soil-cement base gains the strength and durability necessary to withstand traffic loads, but higher cement content also creates more shrinkage, resulting in more reflective cracks in the pavement. Historically, various US states have used soil-cement bases for constructing flexible pavements. The state of Louisiana (USA) had been using 8 to 10 percent cement to manufacture soil-cement bases. Such traditional soil-cement bases yield a 2.0 MPa (300 psi) 7-day compressive strength and are termed cement stabilized design (CSD). As these CSD bases generate significant reflective cracking, another soil-cement base design with 4 to 6 percent cement, called cement treated design (CTD), has been utilized, yielding a 1.0 MPa (150 psi) 7-day compressive strength. The reduced cement content in CTD bases is expected to minimize shrinkage cracking and thus increase pavement service lives. Hence, this research study evaluates the long-term field performance of CTD bases relative to CSD bases in flexible pavements. The Pavement Management System of the state of Louisiana was used to select flexible pavement projects with CSD and CTD bases that had good historical records and time-series distress performance data. It should be noted that the state collects roughness and distress data for each 1/10th-mile section every two years. In total, 120 CSD and CTD projects were analyzed in this research, in which more than 145 miles (CTD) and 175 miles (CSD) of roadway data were accepted for performance evaluation and benefit-cost analyses. Here, the service life extension and the area under the distress performance curve were considered as benefits. It was found that CTD bases added 1 to 5 years of pavement service life based on transverse cracking as compared to CSD bases, while service lives based on longitudinal and alligator cracking, rutting, and roughness index remained the same. Hence, CTD bases provide some service life extension (2.6 years on average) for the controlling distress, transverse cracking, while being less expensive due to their lower cement content. Consequently, CTD bases are 20% more cost-effective than traditional CSD bases when both are compared by the net benefit-cost ratio obtained from all distress types.

Keywords: cement treated base, cement stabilized base, reflective cracking, service life, flexible pavement

Procedia PDF Downloads 157
18842 Synthesis, Characterization and Rheological Properties of Boron Oxide/Polymer Nanocomposites

Authors: Mehmet Doğan, Mahir Alkan, Yasemin Turhan, Zürriye Gündüz, Pinar Beyli, Serap Doğan

Abstract:

Advances and new discoveries in materials science have played an important role in technological development. Today, materials science branches into subfields such as metals, nonmetals, chemicals, and polymers, and polymeric nanocomposites have found a wide field of application as one of the most important of these groups. For many polymers used in different fields of industry, improved thermal stability is desired, and one way to improve this property is to form nanocomposites of the polymers with different fillers. Boron compounds have many areas of use, and their number increases daily; to further extend the application areas and industrial importance of boron compounds, it is necessary to synthesize nano-products and to find new applications for them. In this study, PMMA/boron oxide nanocomposites were synthesized using solution intercalation, polymerization, and melt methods, and PAA/boron oxide nanocomposites using the solution intercalation method. The nanocomposites were characterized by XRD, FTIR-ATR, DTA/TG, BET, SEM, and TEM. The effects of filler amount, solvent type, and mediating reagent on the thermal stability of the polymers were investigated. In addition, the rheological properties of the PMMA/boron oxide nanocomposites synthesized by the melt method were investigated using a high-pressure capillary rheometer. XRD analysis showed that boron oxide was dispersed in the polymer matrix; FTIR-ATR showed interactions of boron oxide with both PAA and PMMA; and TEM showed that the boron oxide particles had a spherical structure and were dispersed at the nanoscale in the polymer matrix. The thermal stability of the polymers increased with the addition of boron oxide, and the decomposition mechanism of PAA changed.
From the rheological measurements, it was found that PMMA and the PMMA/boron oxide nanocomposites exhibited non-Newtonian, pseudoplastic, shear-thinning behavior under all experimental conditions.
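Shear-thinning behavior of this kind is often summarized with the Ostwald-de Waele (power-law) model, tau = K * gamma_dot**n with n < 1. A minimal sketch of fitting K and n from flow-curve data (synthetic values, not the measured rheometer data):

```python
import numpy as np

def fit_power_law(gamma_dot, tau):
    """Fit the Ostwald-de Waele model tau = K * gamma_dot**n by linear
    least squares in log-log space; n < 1 indicates shear thinning."""
    n, log_k = np.polyfit(np.log(gamma_dot), np.log(tau), 1)
    return float(np.exp(log_k)), float(n)
```

Comparing the fitted n across filler loadings would quantify how strongly boron oxide affects the pseudoplasticity of the melt.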

Keywords: boron oxide, polymer, nanocomposite, rheology, characterization

Procedia PDF Downloads 414
18841 Study on an Accurate Calculation Method of Model Attitude in Wind Tunnel Tests

Authors: Jinjun Jiang, Lianzhong Chen, Rui Xu

Abstract:

The accuracy of the model attitude angle plays an important role in the aerodynamic results of a wind tunnel test. The original method applies spherical coordinate system transformations: the model attitude angle is obtained by coordinate transformation and spherical surface mapping from the nominal attitude angle (the balance attitude angle in the wind tunnel coordinate system) indicated by the mechanism. First, the coordinate transformations of this method are not only complex, but it is also difficult to establish the relationships between the space coordinate systems, especially after many transformation steps; moreover, the method cannot realize iterative calculation of the interference between attitude angles. Second, during the calculation, arcs are approximated by straight lines, angles by their tangent values, and inverse trigonometric functions are applied. The attitude angle calculation is therefore complex and inaccurate, and the approximations are acceptable only for small angles of attack.
However, with the development of modern unsteady aerodynamics research, aircraft tend toward high and very high angles of attack and unsteady flight regimes. Based on engineering practice and vector theory, the concept of a vector angle coordinate system is proposed for the first time, and a vector angle coordinate system for the attitude angles is established. With iterative correction and by avoiding the approximate and inverse-trigonometric solution problems, the model attitude calculation is carried out in detail, validating that the accuracy of the calculated model attitude angles is improved. The proposed system gives the transformation and angle definition relations between different flight attitude coordinate systems, so that the attitude angle of the corresponding coordinate system can be calculated accurately and its direction determined. In particular, in channel-coupling calculations, the attitude angle between coordinate systems depends only on the angles and not on the order of the coordinate system changes, which simplifies the calculation process.
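The vector idea can be illustrated with a small sketch: attitude angles recovered from a rotated axis vector via arcsin/arctan2 are exact at any angle, with no small-angle or tangent approximations (the rotation conventions here are illustrative assumptions, not the paper's definitions):

```python
import numpy as np

def Ry(t):
    """Rotation about the y-axis (pitch)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def Rz(t):
    """Rotation about the z-axis (yaw)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def angles_from_axis(u):
    """Pitch and yaw of a unit axis vector u, recovered exactly with
    arcsin/arctan2 -- no small-angle or tangent approximations."""
    theta = -np.arcsin(u[2])        # pitch from the z-component
    psi = np.arctan2(u[1], u[0])    # yaw from the x-y projection
    return theta, psi
```

Because the angles come from the vector itself, the result does not depend on how many intermediate coordinate transformations produced it, which mirrors the order-independence claimed above.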

Keywords: attitude angle, vector angle coordinate system, iterative calculation, spherical coordinate system, wind tunnel test

Procedia PDF Downloads 117
18840 Transient Response of Elastic Structures Subjected to a Fluid Medium

Authors: Helnaz Soltani, J. N. Reddy

Abstract:

The presence of a fluid medium interacting with a structure can lead to failure of the structure. Since developing efficient computational models for fluid-structure interaction (FSI) problems has broad impact on realistic problems encountered in the aerospace, shipbuilding, and oil and gas industries, among others, there is an increasing need for methods to investigate the effect of the fluid domain on the structural response. A coupled finite element formulation of problems involving FSI is an accurate way to predict the response of structures in contact with a fluid medium. This study proposes a finite element approach for studying the transient response of structures interacting with a fluid medium. Since beams and plates are the fundamental elements of almost any structure, the developed method is applied to beam and plate benchmark problems to demonstrate its efficiency. The formulation combines various structural theories with the solid-fluid interface boundary condition, which represents the interaction between the solid and fluid regimes. Here, three different beam theories as well as three different plate theories are considered to model the solid medium, and the Navier-Stokes equations govern the fluid domain. For each theory, a coupled set of equations is derived in which the element matrices of both regimes are calculated by Gaussian quadrature. The main feature of the proposed methodology is to model the fluid domain as an added mass, i.e., as an external distributed force due to the presence of the fluid. We validate the accuracy of the formulation by means of several numerical examples. Since the formulation presented in this study covers several theories from the literature, the applicability of the proposed approach is independent of the structure's geometry.
The effects of varying parameters such as structure thickness ratio, fluid density, and immersion depth are studied using numerical simulations. The results indicate that the maximum vertical deflection of the structure is affected considerably by the presence of a fluid medium.
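The added-mass idea can be illustrated for the simplest case: a simply supported beam whose fluid loading is lumped into an added mass per unit length, which lowers the natural frequency (a textbook sketch with illustrative numbers, not the paper's coupled formulation):

```python
import numpy as np

def beam_frequency(E, I, L, m_struct, m_added=0.0, n=1):
    """n-th flexural natural frequency (rad/s) of a simply supported beam of
    bending stiffness E*I and length L; the surrounding fluid enters only as
    an added mass per unit length m_added on top of the structural mass."""
    return (n * np.pi / L) ** 2 * np.sqrt(E * I / (m_struct + m_added))
```

The frequency ratio wet/dry is sqrt(m/(m + m_added)), so denser fluids or deeper immersion (larger added mass) slow the transient response, consistent with the larger deflections reported above.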

Keywords: beam and plate, finite element analysis, fluid-structure interaction, transient response

Procedia PDF Downloads 553
18839 A Thermo-mechanical Finite Element Model to Predict Thermal Cycles and Residual Stresses in Directed Energy Deposition Technology

Authors: Edison A. Bonifaz

Abstract:

In this work, a numerical procedure is proposed to design dense multi-material structures using the Directed Energy Deposition (DED) process. A thermo-mechanical finite element model to predict thermal cycles and residual stresses is presented. A numerical layer build-up procedure coupled with a moving heat flux was constructed to minimize the strains and residual stresses that result from the multi-layer deposition of AISI 316 austenitic steel on an AISI 304 austenitic steel substrate. To simulate the DED process, the automated interface of the ABAQUS AM module was used to define element activation and heat-input event data as a function of time and position. In this manner, the construction of ABAQUS user-defined subroutines was not necessary. The thermal cycles and thermally induced stresses created during multi-layer deposition and melt-pool crystallization in metal AM were predicted and validated. Results were analyzed in three independent metal layers of three different experiments. The one-way heat and material deposition toolpath used in the analysis was created with a MATLAB path script. An optimal combination of feedstock and heat-input printing parameters suitable for fabricating multi-material dense structures by the DED metal AM process was established. At constant power, it can be concluded that the lower the heat input, the lower the peak temperatures and residual stresses. From a design point of view, this means that the one-way heat and material deposition toolpath with the higher welding speed should be selected.
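The moving heat flux can be sketched, for illustration, as a Gaussian surface source traveling along a one-way toolpath; note that the heat input per unit length scales as eta*P/v, so at constant power a higher travel speed means lower heat input, consistent with the conclusion above (all parameter values below are assumptions, not the study's settings):

```python
import numpy as np

def moving_gaussian_flux(x, y, t, power=300.0, eta=0.35, r0=1e-3, speed=5e-3):
    """Surface heat flux (W/m^2) of a Gaussian spot of absorbed power
    eta*power (W) and radius r0 (m), moving along x at a constant speed (m/s).
    The 2/(pi*r0^2) prefactor normalizes the integral to the absorbed power."""
    r2 = (x - speed * t) ** 2 + y ** 2
    return 2.0 * eta * power / (np.pi * r0 ** 2) * np.exp(-2.0 * r2 / r0 ** 2)
```

In a layer build-up simulation, this flux would be evaluated at each surface integration point per time increment, playing the role of the event-series heat input defined through the AM interface.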

Keywords: event series, thermal cycles, residual stresses, multi-pass welding, ABAQUS AM modeler

Procedia PDF Downloads 50
18838 The Use of Boosted Multivariate Trees in Medical Decision-Making for Repeated Measurements

Authors: Ebru Turgal, Beyza Doganay Erdogan

Abstract:

Machine learning aims to model the relationship between a response and features. Medical decision-making researchers would like to make decisions about patients' course and treatment by examining repeated measurements over time, and the boosting approach is now used in machine learning as an influential tool for these aims. The aim of this study is to demonstrate the use of multivariate tree boosting in this field; the main reason for utilizing this approach in decision-making is the ease with which it models complex relationships. To show how multivariate tree boosting can identify important features and feature-time interactions, we used data collected retrospectively from the records of the Ankara University Chest Diseases Department. The dataset includes repeated PF-ratio measurements, with a follow-up time planned for 120 hours. A set of different models was tested. In conclusion, classification by a weighted combination of classifiers is a reliable method, as has been shown several times in simulations. Furthermore, time-varying variables can be taken into account within this framework, making it possible to make accurate decisions in regression and survival problems.
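The boosting idea, repeatedly fitting small trees to the current residuals and summing them with a learning rate, can be sketched in a few lines with single-split regression stumps (a generic squared-loss illustration, not the multivariate tree boosting implementation used in the study):

```python
import numpy as np

def fit_stump(x, r):
    """Best single-split regression stump (threshold, left mean, right mean)
    on one feature, found by exhaustive least-squares search."""
    best_sse, best = np.inf, None
    for s in np.unique(x)[:-1]:
        left, right = r[x <= s], r[x > s]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best = sse, (s, left.mean(), right.mean())
    return best

def boost(x, y, n_rounds=100, lr=0.1):
    """Gradient boosting for squared loss: start from the mean response and
    repeatedly fit stumps to the current residuals."""
    pred = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_rounds):
        s, lm, rm = fit_stump(x, y - pred)       # fit to residuals
        pred = pred + lr * np.where(x <= s, lm, rm)
        stumps.append((s, lm, rm))
    return pred, stumps
```

In the multivariate, repeated-measurements setting, each weak learner is instead a multivariate tree and the split search includes time, which is what surfaces feature-time interactions.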

Keywords: boosted multivariate trees, longitudinal data, multivariate regression tree, panel data

Procedia PDF Downloads 193
18837 A New Method of Extracting Polyphenols from Honey Using a Biosorbent Compared to the Commercial Resin Amberlite XAD2

Authors: Farid Benkaci-Ali, Abdelhamid Neggad, Sophie Laurent

Abstract:

A new method for extracting polyphenols from honey using a biodegradable resin was developed and compared with the common commercial resin Amberlite XAD2. For this purpose, three honey samples of Algerian origin were selected for the study of different physico-chemical and biochemical parameters. After extraction of the target compounds by both resins, the polyphenol content was determined, the antioxidant activity was tested, and LC-MS analyses were performed for identification and quantification. The results showed that the physico-chemical and biochemical parameters meet the norms of the International Honey Commission, and the H1 sample appeared to be of high quality. The optimal conditions for extraction with the biodegradable resin were a pH of 3, an adsorbent dose of 40 g/L, a contact time of 50 min, an extraction temperature of 60 °C, and no stirring. Both resins could be regenerated and reused for three cycles. The polyphenol contents demonstrated a higher extraction efficiency for the biosorbent than for XAD2, especially for H1. LC-MS analyses allowed the identification and quantification of fifteen compounds in the honey samples extracted with both resins; the most abundant compound was 3,4,5-trimethoxybenzoic acid. In addition, the biosorbent extracts showed stronger antioxidant activities than the XAD2 extracts.

Keywords: extraction, polyphenols, biosorbent, Amberlite resin, HPLC-MS

Procedia PDF Downloads 92
18836 Language Errors Used in “The Space between Us” Movie and Their Effects on Translation Quality: Translation Study toward Discourse Analysis Approach

Authors: Mochamad Nuruz Zaman, Mangatur Rudolf Nababan, M. A. Djatmika

Abstract:

Both society and education teach good communication in order to build up interpersonal skills. Everyone has the capacity to understand something new, whether with good comprehension or poor understanding, and poor understanding produces language errors when people interact for the first time without prior knowledge of each other. The movie "The Space between Us" tells a love-adventure story between a Mars boy and an Earth girl, with many misunderstood conversations caused by their different climates and environments. Moviegoers must also rely on the subtitles to enjoy the movie, yet the Indonesian subtitles and the English dialogue still show overlapping understanding in the translation. Translation here consists of the source language (SL, the English dialogue) and the target language (TL, the Indonesian subtitles). This research gap is formulated in the research question of how language errors occur in the movie and how they affect translation quality, which is analyzed in depth through a translation study with a discourse analysis approach. The research goal is to describe the language errors and their translation quality in order to create a good atmosphere in movie media. The study is designed as embedded qualitative research. The research locus comprises setting, participants, and events as the focused, determined boundary. The data sources are "The Space between Us" movie and informants (translation quality raters). The sampling is criterion-based (purposive sampling). Data collection techniques use content analysis and a questionnaire; data validation applies data-source and method triangulation; and data analysis proceeds through domain, taxonomy, componential, and cultural-theme analysis. The language errors found in the movie are referential, register, societal, textual, receptive, expressive, individual, group, analogical, transfer, local, and global errors. Their effects on translation quality are discussed in terms of the translation techniques found in the data: amplification, borrowing, description, discursive creation, established equivalent, generalization, literal translation, modulation, particularization, reduction, substitution, and transposition.

Keywords: discourse analysis, language errors, The Space between Us movie, translation techniques, translation quality instruments

Procedia PDF Downloads 207
18835 Allelopathic Action of Different Sorghum bicolor [L.] Moench Fractions on Ipomoea grandifolia [Dammer] O'Donell

Authors: Mateus L. O. Freitas, Flávia H. de M. Libório, Letycia L. Ricardo, Patrícia da C. Zonetti, Graciene de S. Bido

Abstract:

Weeds compete with agricultural crops for resources such as light, water, and nutrients. This competition can cause significant damage to agricultural producers, and currently the use of agrochemicals is the most effective method for controlling these undesirable plants. Morning glory (Ipomoea grandifolia [Dammer] O'Donell) is an aggressive weed that significantly reduces agricultural productivity and makes harvesting, especially mechanical harvesting, difficult. The biggest challenge in modern agriculture is to preserve high productivity while reducing environmental damage and maintaining soil characteristics. No-till is a sustainable practice that can reduce the use of agrochemicals and environmental impacts owing to the presence of plant residues in the soil, which release allelopathic compounds that reduce the incidence or alter the growth and development of crops and weeds. Sorghum (Sorghum bicolor [L.] Moench) is a forage with proven allelopathic activity, mainly through the production of sorgoleone. In this context, this research aimed to evaluate the allelopathic action of sorghum fractions obtained with hexane, dichloromethane, butanol, and ethyl acetate on the germination and initial growth of morning glory. The parameters analyzed were germination percentage, germination speed, seedling length, and biomass weight (fresh and dry). The bioassays were performed in Petri dishes kept in an incubation chamber for 7 days at 25 °C with a 12 h photoperiod. The experimental design was completely randomized, with five replicates of each treatment. The data were evaluated by analysis of variance, and the treatment means were compared using the Scott-Knott test at a 5% significance level. The results indicated that the dichloromethane and ethyl acetate fractions showed bioherbicidal effects, promoting effective reductions in the germination and initial growth of morning glory. It was concluded that allelochemicals were probably extracted in these fractions.
These secondary metabolites can reduce the use of agrochemicals and environmental impact, making agricultural production systems more sustainable.

Keywords: allelochemicals, secondary metabolism, sorgoleone, weeds

Procedia PDF Downloads 136
18834 Screening of Factors Affecting the Enzymatic Hydrolysis of Empty Fruit Bunches in Aqueous Ionic Liquid and Locally Produced Cellulase System

Authors: Md. Z. Alam, Amal A. Elgharbawy, Muhammad Moniruzzaman, Nassereldeen A. Kabbashi, Parveen Jamal

Abstract:

The enzymatic hydrolysis of lignocellulosic biomass is one of the obstacles in the process of sugar production, due to the presence of lignin, which protects the cellulose molecules against cellulases. Although the pretreatment of lignocellulose in ionic liquid (IL) systems has received a lot of interest, it requires IL removal with an anti-solvent before enzymatic hydrolysis can proceed. At this point, introducing an IL-compatible cellulase seems more efficient. A cellulase produced by Trichoderma reesei on palm kernel cake (PKC), called PKC-Cel, exhibited promising stability in several ILs; its optimum pH and temperature as well as its molecular weight were determined. One of the evaluated ILs, 1,3-diethylimidazolium dimethyl phosphate ([DEMIM] DMP), was applied in this study. Six factors were screened with a definitive screening design in Stat-Ease Design Expert V9: IL/buffer ratio, temperature, hydrolysis retention time, biomass loading, cellulase loading, and empty fruit bunch (EFB) particle size. According to the obtained data, the IL-enzyme system gives the highest sugar concentration at 70 °C, 27 hours, 10% IL in buffer, 35% biomass loading, 60 units/g cellulase, and 200 μm particle size. The data show not only that PKC-Cel was stable in the presence of the IL, but also that it was stable at a temperature higher than its optimum. The reducing sugar obtained was 53.468 ± 4.58 g/L, equivalent to 0.3055 g reducing sugar/g EFB. This approach opens an avenue for further studies of the actual effect of ILs on cellulases and their interactions in aqueous systems, and could benefit the efficient production of bioethanol from lignocellulosic biomass.
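Once the screening runs are collected, the main effects of the coded factors can be estimated with a simple least-squares fit, which is essentially what screening-design software reports; a generic sketch (illustrative data, not the study's design matrix or measurements):

```python
import numpy as np

def main_effects(X, y):
    """Least-squares main-effect estimates for coded factors.
    Columns of X hold the coded levels (-1, 0, +1) of each factor;
    returns [intercept, effect_1, ..., effect_k]."""
    A = np.column_stack([np.ones(len(y)), X])   # design matrix with intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```

Factors with effect estimates near zero (like the third factor in the test data) are the candidates to drop after screening, leaving the active ones, such as temperature and biomass loading here, for optimization.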

Keywords: cellulase, hydrolysis, lignocellulose, pretreatment

Procedia PDF Downloads 354
18833 Development of Personal Protection Equipment for Dental Surgeon

Authors: Thi. A. D. Tran, Matthieu Arnold, Dominique Adolphe, Laurence Schcher, Guillaume Reys

Abstract:

During daily oral health care, dental surgeons are in contact with numerous potentially infectious germs from patients' saliva and blood. To take these risks into account, a product development process was carried out to propose to dental surgeons personal protection equipment that meets their expectations in terms of image, protection, and comfort. After a consumer study to evaluate how users wear the garment and what they expect from it, specifications were drawn up and technical solutions were developed to answer as many of the requirements as possible. Thermal and comfort studies were performed, and the results led to the technical solutions for the design of the new scrub. Three main functions were investigated: ergonomics, protection, and thermal comfort. For the ergonomic aspect, instrumented garments were worn and pressure measurements were taken; the results highlight that a raglan sleeve shape should be selected for better dynamic comfort. Spray tests helped to localize the potential contamination areas, and protection devices were placed on the garment accordingly. Concerning thermal comfort, an infrared study was conducted in a consulting room under real working conditions, and the heating zones were detected. Based on these results, solutions were proposed and implemented in a new gown. The new gown is composed of three different parts: a protective layer in the chest area to avoid contamination, a breathable layer in the back and armpits, and a standard PET/cotton fabric for the rest of the gown. Fitting tests conducted in a hospital showed that the new design was highly appreciated, although some points can still be improved; a final product will be produced based on the necessary improvements.

Keywords: comfort, dentists, garment, thermal

Procedia PDF Downloads 295
18832 Understanding How to Increase Restorativeness of Interiors: A Qualitative Exploratory Study on Attention Restoration Theory in Relation to Interior Design

Authors: Hande Burcu Deniz

Abstract:

People in the U.S. spend a considerable portion of their time indoors, which makes it crucial to provide environments that support their well-being. Restorative environments help people recover the cognitive resources spent through intensive use of directed attention. Spending time in nature and taking a nap are two of the best ways to restore these resources, but neither is possible most of the time. The problem is that many studies have revealed how nature and natural contexts can boost restoration, while far fewer have examined how cognitive resources can be restored in interior settings. This study explores the answer to this question: which qualities of interiors increase the restorativeness of an interior setting, and how do they mediate it? To this end, a phenomenological qualitative study was conducted, concerned with the definition of attention restoration and experiences of the phenomenon. As themes emerged, they were analyzed against the components of Attention Restoration Theory (being away, extent, fascination, compatibility) to examine how interior design elements mediate the restorativeness of an interior. The data were gathered from semi-structured interviews with international residents of Minnesota. The interviewees represent young professionals who work in Minnesota and often experience mental fatigue; they also have fewer emotional connections with places in Minnesota, which allowed the data to be based on the physical qualities of a space rather than on emotional connections. In the interviews, participants were asked where they prefer to be when they experience mental fatigue, and then to describe the physical qualities of these places and their reasons. Four themes were derived from the analysis of the interviews, listed here in order of their frequency.
The first, and most common, the theme was “connection to outside”. The analysis showed that people need to be either physically or visually connected to recover from mental fatigue. Direct connection to nature was reported as preferable, whereas urban settings were the secondary preference along with interiors. The second theme emerged from the analysis was “the presence of the artwork,” which was experienced differently by the interviewees. The third theme was “amenities”. Interviews pointed out that people prefer to have the amenities that support desired activity during recovery from mental fatigue. The last theme was “aesthetics.” Interviewees stated that they prefer places that are pleasing to their eyes. Additionally, they could not get rid of the feeling of being worn out in places that are not well-designed. When we matched the themes with the four art components (being away, extent, fascination, compatibility), some of the interior qualities showed overlapping since they were experienced differently by the interviewees. In conclusion, this study showed that interior settings have restorative potential, and they are multidimensional in their experience.

Keywords: attention restoration, fatigue, interior design, qualitative study, restorative environments

Procedia PDF Downloads 240
18831 Micropillar-Assisted Electric Field Enhancement for High-Efficiency Inactivation of Bacteria

Authors: Sanam Pudasaini, A. T. K. Perera, Ahmed Syed Shaheer Uddin, Sum Huan Ng, Chun Yang

Abstract:

Development of high-efficiency and environmentally friendly bacterial inactivation methods is of great importance for preventing waterborne diseases, which are among the leading causes of death in the world. Traditional bacterial inactivation methods (e.g., ultraviolet radiation and chlorination) have several limitations, such as long treatment times, formation of toxic byproducts, and bacterial regrowth. Recently, an electroporation-based inactivation method was introduced as a substitute. Here, an electroporation-based continuous-flow microfluidic device equipped with an array of micropillars is developed, and the device achieved high bacterial inactivation performance (> 99.9%) within a short exposure time (< 1 s). More than 99.9% reduction of Escherichia coli bacteria was obtained at a flow rate of 1 mL/hr, and no regrowth of bacteria was observed. Images from a scanning electron microscope confirmed the formation of electroporation-induced nanopores in the cell membrane. Numerical simulation showed that the electric field strength required for bacterial electroporation (3 kV/cm) was generated by the PDMS micropillars at an applied voltage of 300 V. Further, this inactivation method involves no chemicals, and the formation of harmful by-products is minimal.

Keywords: electroporation, high-efficiency, inactivation, microfluidics, micropillar

Procedia PDF Downloads 165
18830 Towards a Vulnerability Model Assessment of the Alexandra Jukskei Catchment in South Africa

Authors: Vhuhwavho Gadisi, Rebecca Alowo, German Nkhonjera

Abstract:

This article details an investigation of groundwater management in the Jukskei Catchment of South Africa through spatial mapping of key hydrological relationships, interactions, and parameters. The Department of Water Affairs (DWA) noted gaps in the implementation of article 16 of the South African National Water Act 1998, including the lack of appropriate models for dealing with water quantity parameters. For this reason, this research conducted a DRASTIC GIS-based groundwater assessment to improve the groundwater monitoring system in the Jukskei River catchment. The methodology was a mixed-methods design that combined DRASTIC analysis, questionnaires, a literature review, and observations to gather information on how to help people who use the Jukskei River. GIS (geographical information system) mapping was carried out using the seven-parameter DRASTIC (Depth to water, Recharge, Aquifer media, Soil media, Topography, Impact of the vadose zone, Hydraulic conductivity) vulnerability methodology. In addition, the resulting vulnerability map was subjected to sensitivity analysis as a validation method, including single-parameter sensitivity, map-removal sensitivity, and correlation analysis of the DRASTIC parameters. The findings were that approximately 5.7% (45 km²) of the area, in the northern part of the Jukskei watershed, is highly vulnerable. Approximately 53.6% (428.8 km²) of the basin is also at high risk of groundwater contamination; this area lies mainly in the central, north-eastern, and western parts of the sub-basin. The medium and low vulnerability classes cover approximately 18.1% (144.8 km²) and 21.7% (168 km²) of the catchment, respectively. The shallow groundwater of the Jukskei River is thus highly vulnerable.
Sensitivity analysis indicated that depth to water, recharge, aquifer media, soil media, and topography were the main factors contributing to the vulnerability assessment. The final vulnerability map indicates that the Jukskei catchment is highly susceptible to pollution, and protective measures are therefore needed for sustainable management of groundwater resources in the study area.
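For concreteness, the DRASTIC index underlying such a map is a plain weighted sum of seven parameter ratings. The sketch below uses the standard DRASTIC weights; the example cell ratings and the class break values are illustrative assumptions, not values from the Jukskei study.

```python
# Standard DRASTIC weights: D=5, R=4, A=3, S=2, T=1, I=5, C=3
WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

def drastic_index(ratings):
    """DRASTIC index for one grid cell.

    ratings: dict mapping each parameter letter to a 1-10 rating
    assigned from hydrogeological data (depth to water, recharge, ...).
    The index therefore ranges from 23 (all ratings 1) to 230 (all 10).
    """
    return sum(WEIGHTS[p] * ratings[p] for p in WEIGHTS)

def vulnerability_class(index):
    """Map an index to a coarse class; these break values are
    illustrative assumptions, not the ones used in the study."""
    if index >= 180:
        return "high"
    if index >= 140:
        return "moderate"
    return "low"

# Example cell with a shallow water table and high recharge
cell = {"D": 9, "R": 8, "A": 6, "S": 6, "T": 9, "I": 8, "C": 6}
idx = drastic_index(cell)
```

In a GIS workflow, this sum is evaluated per raster cell after each parameter layer has been rated and reclassified.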

Keywords: contamination, DRASTIC, groundwater, vulnerability, model

Procedia PDF Downloads 69
18829 Forecast of Polyethylene Properties in the Gas Phase Polymerization Aided by Neural Network

Authors: Nasrin Bakhshizadeh, Ashkan Forootan

Abstract:

A major problem affecting polymer quality control in industrial polymerization is the lack of suitable on-line measurement tools for polymer properties such as the melt index and density. Conventionally, the polymerization is controlled manually by taking samples, measuring polymer quality in the laboratory, and recording the results. This procedure is highly time-consuming and leads to large quantities of off-specification product. The online application for estimating melt index and density proposed in this study is a neural network based on input-output data from a polyethylene production plant. Reactor temperature, bed level, the mass flow rates of ethylene, hydrogen, and butene-1, and the molar concentrations of ethylene, hydrogen, and butene-1 are used to establish the neural model. The network is trained on actual operational data using back-propagation and Levenberg-Marquardt techniques. The simulated results indicate that a neural process model with three layers (one hidden layer) for forecasting density, and four layers for the melt index, successfully predicts these quality properties.
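The soft-sensor idea can be sketched with a minimal one-hidden-layer network trained by plain batch back-propagation on synthetic stand-in data; the Levenberg-Marquardt refinement and the real plant measurements are omitted, and every number below is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 7))      # 7 scaled process inputs
true_w = rng.uniform(-1.0, 1.0, size=7)
y = np.tanh(X @ true_w)[:, None]              # synthetic "density" target

n_hidden = 8
W1 = rng.normal(0.0, 0.5, size=(7, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, size=(n_hidden, 1))
b2 = np.zeros(1)
lr = 0.1

for _ in range(3000):
    H = np.tanh(X @ W1 + b1)                  # hidden layer
    pred = H @ W2 + b2                        # linear output layer
    err = pred - y
    gW2 = H.T @ err / len(X)                  # squared-error gradients
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1.0 - H**2)          # backpropagate through tanh
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

After training, the fitted error should fall well below the variance of the target, i.e. the model beats a constant-mean predictor.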

Keywords: polyethylene, polymerization, density, melt index, neural network

Procedia PDF Downloads 130
18828 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact

Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed

Abstract:

The Bayesian network (BN) is one of the most efficient classification methods and is widely used in several fields (e.g., medical diagnostics, risk analysis, bioinformatics research). A BN is a probabilistic graphical model that provides a formalism for reasoning under uncertainty, and this classification method performs well at extracting new knowledge from data. Constructing the model consists of two phases: structure learning and parameter learning. For structure learning, the K2 algorithm is one of the representative data-driven algorithms, based on a score-and-search approach. In addition, integrating expert knowledge into the structure learning process yields the highest accuracy. In this paper, we propose a hybrid approach combining an improved K2 algorithm, called the K2 algorithm for Parents and Children search (K2PC), with an expert-driven method for learning the structure of a BN. Evaluation of the experimental results on well-known benchmarks shows that our K2PC algorithm performs better in terms of correct structure detection. A real application of our model demonstrates its efficiency in analyzing the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
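The score-and-search core that K2-style algorithms share is the Cooper-Herskovits K2 metric, which scores a node against a candidate parent set on discrete data. A minimal sketch on toy data follows (the parent-and-children search refinement of K2PC and the phosphate-laundry dataset are not reproduced here):

```python
from math import lgamma

def k2_log_score(data, node, parents, arity):
    """log K2 score of `node` given `parents` on discrete data.

    data: list of dicts mapping variable name -> state in 0..arity[v]-1.
    Implements log[ prod_j (r-1)!/(N_ij+r-1)! prod_k N_ijk! ] via lgamma.
    """
    r = arity[node]
    counts = {}
    # group rows by the joint parent configuration
    for row in data:
        cfg = tuple(row[p] for p in parents)
        counts.setdefault(cfg, [0] * r)[row[node]] += 1
    score = 0.0
    for n_ijk in counts.values():
        n_ij = sum(n_ijk)
        score += lgamma(r) - lgamma(n_ij + r)          # (r-1)!/(N_ij+r-1)!
        score += sum(lgamma(n + 1) for n in n_ijk)     # prod_k N_ijk!
    return score

# Toy data where B copies A: A should score better as a parent of B
data = [{"A": a, "B": a} for a in (0, 1)] * 10
arity = {"A": 2, "B": 2}
with_parent = k2_log_score(data, "B", ["A"], arity)
no_parent = k2_log_score(data, "B", [], arity)
```

A greedy K2 search would add the parent with the largest score improvement at each step, subject to a node ordering.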

Keywords: Bayesian network, classification, expert knowledge, structure learning, surface water analysis

Procedia PDF Downloads 115
18827 Alternative Approach to the Machine Vision System Operating for Solving Industrial Control Issue

Authors: M. S. Nikitenko, S. A. Kizilov, D. Y. Khudonogov

Abstract:

The paper considers an approach to a machine vision operating system combined with a grid of light markers. The approach is used to solve several scientific and technical problems, such as measuring the capacity of an apron feeder delivering coal from a lining return port to a conveyor in thick-seam mining with coal release onto a conveyor, and prototyping an obstacle detection system for an autonomous vehicle. A method for calculating bulk material volume was first verified using three-dimensional modeling and then validated under laboratory conditions, with relative errors calculated. A method of calculating apron feeder capacity based on a machine vision system, together with a simplified three-dimensional model of the examined measuring area, is offered. The proposed method allows the volume of rock mass moved by an apron feeder to be measured using machine vision, solving the problem of controlling the volume of coal produced by a feeder during thick-seam extraction by longwall complexes with release onto a conveyor, with accuracy sufficient for practical application. The developed mathematical apparatus for measuring feeder productivity in kg/s uses only basic arithmetic operations: addition, subtraction, multiplication, and division. This simplifies software development and widens the variety of microcontrollers and microcomputers suitable for calculating feeder capacity. A feature of the obstacle detection problem is that obstacles distort the laser grid, which simplifies their detection. The paper presents algorithms for video camera image processing and for controlling an autonomous vehicle model based on an obstacle-detecting machine vision system. A sample fragment of obstacle detection at the moment the laser grid is distorted is demonstrated.
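The capacity arithmetic the abstract alludes to can be sketched as follows: machine vision yields a grid of material heights over the feeder pan, volume is the sum of height times cell area, and mass flow follows from bulk density and belt speed using only the four basic operations. All grid values and constants below are illustrative assumptions, not measurements.

```python
def feeder_capacity(heights, cell_area_m2, bulk_density_kg_m3,
                    belt_speed_m_s, slice_length_m):
    """kg/s moved by the feeder, using only +, -, *, / as the
    abstract emphasizes (hence suitable for small microcontrollers)."""
    volume_m3 = 0.0
    for row in heights:
        for h in row:
            volume_m3 = volume_m3 + h * cell_area_m2
    # the measured slice passes the camera in slice_length / speed seconds
    transit_time_s = slice_length_m / belt_speed_m_s
    return volume_m3 * bulk_density_kg_m3 / transit_time_s

heights = [[0.10, 0.12], [0.08, 0.10]]     # m above the empty pan
rate = feeder_capacity(heights, cell_area_m2=0.01,
                       bulk_density_kg_m3=900.0,
                       belt_speed_m_s=0.2, slice_length_m=0.2)
```

Here the 0.004 m³ slice traverses the measuring area in 1 s, giving 3.6 kg/s at the assumed bulk density.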

Keywords: machine vision, machine vision operating system, light markers, measuring capability, obstacle detection system, autonomous transport

Procedia PDF Downloads 100
18826 Suppression Subtractive Hybridization Technique for Identification of the Differentially Expressed Genes

Authors: Tuhina-khatun, Mohamed Hanafi Musa, Mohd Rafii Yosup, Wong Mui Yun, Aktar-uz-Zaman, Mahbod Sahebi

Abstract:

The suppression subtractive hybridization (SSH) method is a valuable tool for identifying differentially regulated genes, such as disease-specific or tissue-specific genes important for cellular growth and differentiation. It is a widely used method for separating DNA molecules that distinguish two closely related DNA samples, and one of the most powerful and popular methods for generating subtracted cDNA or genomic DNA libraries. It is based primarily on a suppression polymerase chain reaction (PCR) technique and combines normalization and subtraction in a single procedure. The normalization step equalizes the abundance of DNA fragments within the target population, and the subtraction step excludes sequences that are common to the populations being compared. This dramatically increases the probability of obtaining low-abundance, differentially expressed cDNAs or genomic DNA fragments and simplifies analysis of the subtracted library. The SSH technique is applicable to many comparative and functional genetic studies for identifying disease-related, developmental, tissue-specific, or otherwise differentially expressed genes, as well as for recovering genomic DNA fragments that distinguish the samples under comparison.

Keywords: suppression subtractive hybridization, differentially expressed genes, disease specific genes, tissue specific genes

Procedia PDF Downloads 417
18825 Fuzzy Total Factor Productivity by Credibility Theory

Authors: Shivi Agarwal, Trilok Mathur

Abstract:

This paper proposes a method to measure total factor productivity (TFP) change by credibility theory for fuzzy input and output variables. TFP change has been widely studied with crisp input and output variables; however, in some cases the input and output data of decision-making units (DMUs) can only be measured with uncertainty. Such data can be represented as linguistic variables characterized by fuzzy numbers. The Malmquist productivity index (MPI) is widely used to estimate TFP change by calculating the total factor productivity of a DMU for different time periods using data envelopment analysis (DEA). Here, the fuzzy DEA (FDEA) model is solved using credibility theory, and the FDEA results are used to measure TFP change for fuzzy input and output variables. Finally, numerical examples illustrate the proposed method. The suggested methodology can be utilized for performance evaluation of DMUs and can help assess their level of integration. It can also rank the DMUs, identify those that are lagging behind, and support recommendations on how they can improve their performance to bring them on par with the other DMUs.
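For the special case of one input and one output under constant returns to scale, the crisp MPI that the paper fuzzifies needs no LP solver, since DEA efficiency reduces to a ratio against the best output/input in the reference period. A sketch with invented two-period data:

```python
from math import sqrt

def efficiency(x, y, frontier):
    """Output-oriented CRS efficiency of (x, y) against a frontier
    given as a list of (input, output) pairs."""
    best = max(yo / xo for xo, yo in frontier)
    return (y / x) / best

def malmquist(dmu_t0, dmu_t1, frontier_t0, frontier_t1):
    """Geometric-mean Malmquist index for one DMU between two periods."""
    e00 = efficiency(*dmu_t0, frontier_t0)  # period-0 point, period-0 frontier
    e01 = efficiency(*dmu_t1, frontier_t0)  # period-1 point, period-0 frontier
    e10 = efficiency(*dmu_t0, frontier_t1)
    e11 = efficiency(*dmu_t1, frontier_t1)
    return sqrt((e01 / e00) * (e11 / e10))

# Three DMUs per period as (input, output); the evaluated DMU raises
# output 20% at constant input, and so should its MPI.
frontier_t0 = [(2.0, 2.0), (3.0, 4.5), (4.0, 4.0)]
frontier_t1 = [(2.0, 2.4), (3.0, 5.4), (4.0, 4.8)]
mpi = malmquist((3.0, 4.5), (3.0, 5.4), frontier_t0, frontier_t1)
```

The fuzzy extension replaces each crisp efficiency with the credibility-based FDEA score; MPI > 1 still indicates productivity growth.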

Keywords: chance-constrained programming, credibility theory, data envelopment analysis, fuzzy data, Malmquist productivity index

Procedia PDF Downloads 345
18824 Mobile App Architecture in 2023: Build Your Own Mobile App

Authors: Mounir Filali

Abstract:

Companies use many innovative ways to reach their customers and stay ahead of the competition, and the growing demand for innovative business solutions brings a matching demand for new technology. The most noticeable area of demand for business innovation is the mobile application industry. Companies have recognized the growing need to integrate proprietary mobile applications into their suite of services, realizing that mobile apps give them a competitive edge, and many have begun to develop them rapidly. Mobile application development helps companies meet the needs of their customers, and mobile apps help businesses take advantage of every potential opportunity to generate leads that convert into sales. With the recent rise in demand for business-related mobile apps, there has been a similar rise in the range of mobile app solutions on offer. Today, companies can take the traditional route of a software development team to build their own mobile applications, but there are also many platform-ready "low-code and no-code" options to choose from. These options streamline the development process and help companies stay responsive to their customers without having to be coding experts. Companies must have a basic understanding of mobile app architecture to attract and maintain the interest of mobile app users. Mobile application architecture refers to the structural systems and design elements that make up a mobile application, together with the technologies, processes, and components used during its development. All elements of the mobile application architecture form the underlying foundation of an application, and developing a good mobile app architecture requires proper planning and strategic design.
The technology framework or platform on the back end and the user-facing side of a mobile application are both part of its mobile architecture. Software developers loosely refer to this set of architectural systems and processes as the "technology stack".

Keywords: mobile applications, development, architecture, technology

Procedia PDF Downloads 81
18823 Using Teachers' Perceptions of Science Outreach Activities to Design an 'Optimum' Model of Science Outreach

Authors: Victoria Brennan, Andrea Mallaburn, Linda Seton

Abstract:

Science outreach programmes connect school pupils with external agencies that provide activities and experiences enhancing the pupils' exposure to science. These programmes aim not only to support teachers with curriculum engagement and to promote scientific literacy, but also to provide pivotal opportunities to spark scientific interest in students; a further objective is to increase awareness of career opportunities in the field. Although outreach work is often described as a fun and satisfying venture, many researchers express caution about how successful it is at increasing post-16 engagement in science. When the impact of outreach programmes is researched, it is usually student feedback on the activities, or enrolment numbers in particular post-16 science courses, that are generated and analysed. Although this is informative, the longevity of a programme's impact would be better gauged by teachers' perceptions, evidence of which is far more limited in the literature. In addition, there are strong suggestions that teachers indirectly influence students' self-concept. These themes shape the focus and importance of this ongoing research project, whose rationale is that teachers are an under-used resource in the design of science outreach programmes. The end result of the research will be an 'optimum' model of outreach, which should interest wider stakeholders, such as universities and private or government organisations, that design science outreach programmes in the hope of recruiting future scientists. In phase one, questionnaires (n=52) and interviews (n=8) generated both quantitative and qualitative data.
These have been analysed using the Wilcoxon non-parametric test to compare teachers' perceptions of science outreach interventions, and thematic analysis for the open-ended questions. Both research activities capture a cross-section of teacher opinions of science outreach across all educational levels. An early draft of the 'optimum' model of science outreach delivery was therefore generated from both the literature and the primary data. The final (ongoing) phase aims to refine this model using teacher focus groups to provide constructive feedback on the proposed model; the analysis applies principles of modified Grounded Theory so that the focus group data further strengthen the model. The research thus takes a pragmatist approach, drawing on the strengths of the different paradigms encountered so that the data collected provide the most suitable information for creating an improved model of sustainable outreach. The results discussed focus on this 'optimum' model and on teachers' perceptions of the benefits and drawbacks of engaging with science outreach work. Although the model is still a work in progress, it offers insight both into how teachers feel outreach delivery can be a sustainable intervention tool in the classroom and into what providers of such programmes should consider when designing science outreach activities.
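As a methods aside, the Wilcoxon signed-rank statistic used in phase one can be sketched for paired teacher ratings. The ratings below are invented; in practice a library routine such as scipy.stats.wilcoxon would also supply the p-value.

```python
def wilcoxon_w(before, after):
    """Wilcoxon signed-rank W: the smaller of the positive and
    negative signed-rank sums (zero differences dropped, tied
    absolute differences given their average rank)."""
    diffs = [b - a for b, a in zip(before, after) if b != a]
    ranked = sorted(diffs, key=abs)
    ranks = {}
    i = 0
    while i < len(ranked):
        j = i
        while j < len(ranked) and abs(ranked[j]) == abs(ranked[i]):
            j += 1
        avg = (i + 1 + j) / 2          # average of 1-based ranks i+1..j
        for k in range(i, j):
            ranks[k] = avg
        i = j
    w_plus = sum(r for k, r in ranks.items() if ranked[k] > 0)
    w_minus = sum(r for k, r in ranks.items() if ranked[k] < 0)
    return min(w_plus, w_minus)

# Paired ratings of one intervention before/after a workshop
w = wilcoxon_w([4, 5, 3, 4, 2], [3, 3, 3, 5, 1])
```

A small W relative to the number of non-zero pairs indicates a systematic shift between the paired ratings.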

Keywords: educational partnerships, science education, science outreach, teachers

Procedia PDF Downloads 108
18822 Development of a Systematic Approach to Assess the Applicability of Silver Coated Conductive Yarn

Authors: Y. T. Chui, W. M. Au, L. Li

Abstract:

Wearable electronic textiles have recently been emerging in the market and have developed rapidly: besides the needs of clothing for leisure, fashion, and personal protection, there is also high demand for clothing that can function in this electronic age, offering interactive interfaces, sensual and tangible touch, social fabric, material witness, and so on. With the requirement that wearable electronic textiles be comfortable, attractive, and easy to care for, conductive yarn becomes one of the most important fundamental elements within the wearable electronic textile, used to interconnect different functional units or to create a functional unit. The properties of conductive yarns from different companies can vary to a large extent. The criteria for selecting conductive yarns are vitally important, as they directly affect the optimization, prospects, applicability, and performance of the final garment. However, according to the literature review, little research on commercially available conductive yarns focuses on assessment methods for selecting the material scientifically and systematically under different conditions. This study therefore gives direction for selecting high-quality conductive yarns: the stability and reliability of the yarns are tested against the problems industrialists would experience with them at each manufacturing stage. The assessment system is classified into four stages: 1) yarn stage, 2) fabric stage, 3) apparel stage, and 4) end-user stage. Several tests with clear experimental procedures and parameters are suggested for each stage. The assessment method holds that optimal conductive yarns should be stable in their properties and resistant to various forms of corrosion at every production stage and during use.
This demonstration of the assessment method is expected to serve as a pilot study that systematically assesses the stability of Ag/nylon yarns under various conditions, i.e., during mass production with textile industry procedures and from the consumer perspective. It aims to help industrialists understand the qualities and properties of conductive yarns and suggests a few important parameters they should bear in mind for greater suitability, precision, and controllability.

Keywords: applicability, assessment method, conductive yarn, wearable electronics

Procedia PDF Downloads 522
18821 Fuzzy Expert Systems Applied to Intelligent Design of Data Centers

Authors: Mario M. Figueroa de la Cruz, Claudia I. Solorzano, Raul Acosta, Ignacio Funes

Abstract:

This technological development project seeks to create a tool that allows companies needing to implement a data center to intelligently determine the factors for allocating cooling and uninterruptible power supply (UPS) resources at the design stage. The results should clearly show the speed, robustness, and reliability of a system designed for deployment in environments where large volumes of data must be managed and protected.
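A minimal sketch of the kind of fuzzy inference such a tool could use: triangular memberships over server load and Sugeno-style rule consequents that size the UPS. The membership breakpoints and consequent values here are invented placeholders, not the project's knowledge base.

```python
def tri(x, a, b, c):
    """Triangular membership: rises a->b, falls b->c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ups_kva(load_kw):
    """Crisp UPS size from fuzzy load classes (weighted average).

    The 'high' class has an open right shoulder, approximated here
    with a very large upper breakpoint."""
    mu = {"low": tri(load_kw, -1.0, 0.0, 50.0),
          "medium": tri(load_kw, 25.0, 75.0, 125.0),
          "high": tri(load_kw, 100.0, 150.0, 1e9)}
    out = {"low": 40.0, "medium": 120.0, "high": 250.0}  # rule consequents
    total = sum(mu.values())
    return sum(mu[k] * out[k] for k in mu) / total

size = ups_kva(60.0)
```

At 60 kW only the "medium" rule fires, so the crisp output equals its consequent; intermediate loads blend two rules.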

Keywords: telecommunications, data center, fuzzy logic, expert systems

Procedia PDF Downloads 332
18820 Multimodal Analysis of News Magazines' Front-Page Portrayals of the US, Germany, China, and Russia

Authors: Alena Radina

Abstract:

On the global stage, national image is shaped by historical memory of wars and alliances, by government ideology, and particularly by media stereotypes that represent countries in positive or negative ways. News magazine covers are a key site for national representation. The object of analysis in this paper is the portrayal of the US, Germany, China, and Russia on the front pages and in the cover stories of "Time", "Der Spiegel", "Beijing Review", and "Expert". Political comedy helps people learn about current affairs even if politics is not their area of interest, so satire indirectly sets the public agenda. Coupled with satirical messages, cover images and the linguistic messages embedded in the covers become persuasive visual and verbal factors, known to drive about 80% of magazine sales. Preliminary analysis identified satirical elements in the magazine covers, elements known to influence and frame understandings and to attract younger audiences. Multimodal and transnational comparative framing analyses lay the groundwork for investigating why journalists, editors, and designers deploy certain frames rather than others. This research investigates to what degree the frames used on covers correlate with the frames within the cover stories, and what these framings can tell us about media professionals' representations of their own and other nations. The study sample comprises 32 covers: two covers representing each of the four chosen countries from each of the four magazines. The sampling framework covers two time periods, allowing comparison of each country's representation under two different presidents and between men and women where present. The countries selected represent each category of the international news flows model: the US and Germany are core nations, China is semi-peripheral, and Russia is peripheral.
Examining the textual and visual design elements on the covers, and the images in the cover stories, reveals not only what editors believe visually attracts the reader's attention to the magazine but also how the magazines frame and construct national images and national leaders. The cover is the most powerful editorial and design page in a magazine because images are less intrusive framing tools; covers therefore require less cognitive effort from audiences, who may be more likely to accept the visual frame without question. Analysis of the design and linguistic elements of magazine covers helps us understand how media outlets shape their audiences' perceptions and how magazines frame global issues. While previous multimodal research on covers has focused mostly on lifestyle magazines or newspapers, this paper examines the power of current affairs magazines' covers to shape audience perception of national image.

Keywords: framing analysis, magazine covers, multimodality, national image, satire

Procedia PDF Downloads 86
18819 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU

Authors: Ali Abdul Kadhim, Fue Lien

Abstract:

Solid particle distribution on an impingement surface has been simulated utilizing a graphics processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) model. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow uses the D3Q19 lattice, while the particle model employs the D3Q27 lattice. Particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined by the particle bulk density and velocity, taking all external forces into account. Previous models distributed particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes these deficiencies and can therefore better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model for simulating complex multiphase fluid flows, the approach is still expensive in terms of the memory and computational time required for 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work, with the CUDA parallel programming platform and the cuRAND library forming an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with results available in the open literature.
The GPU code was then used to simulate particle transport and deposition in a turbulent impinging jet at Re=10,000. The simulations were conducted for L/D = 2, 4, and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of the Stokes number on the particle deposition profile was studied at the different L/D ratios. For comparison, an in-house serial CPU code was also developed, coupling LBM with the classical Lagrangian particle dispersion model. Agreement between the results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup ratio of about 350 over the serial code running on a single CPU.
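The velocity-weighted CA hop that distinguishes this class of model from unconditional redistribution can be illustrated on a 1D toy lattice (the real model works on D3Q27 neighborhoods in 3D with cuRAND on the GPU; the probabilities and counts below are illustrative assumptions):

```python
import random

def ca_step(counts, velocity, dt=1.0, dx=1.0, seed=42):
    """Advance integer particle counts one step on a 1D periodic
    lattice: each particle hops one cell in the local flow direction
    with probability proportional to the local fluid speed, so the
    redistribution respects both velocity and particle number."""
    rng = random.Random(seed)
    n = len(counts)
    new = [0] * n
    for i, c in enumerate(counts):
        p = min(abs(velocity[i]) * dt / dx, 1.0)   # hop probability
        step = 1 if velocity[i] > 0 else -1
        for _ in range(c):
            if rng.random() < p:
                new[(i + step) % n] += 1           # hop downstream
            else:
                new[i] += 1                        # stay put
    return new

counts = [100, 0, 0, 0]
u = [0.5, 0.5, 0.5, 0.5]        # uniform rightward flow, |u| dt/dx = 0.5
after = ca_step(counts, u)
```

Particle number is conserved exactly, and in one step particles move at most one cell, mirroring the lattice-local nature of the CA update.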

Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model

Procedia PDF Downloads 193
18818 Mixed Number Algebra and Its Application

Authors: Md. Shah Alam

Abstract:

Mushfiq Ahmad has defined a Mixed Number as the sum of a scalar and a Cartesian vector, along with the elementary group operations of Mixed Numbers: the norm, the product of two Mixed Numbers, the identity element, and the inverse. It has been observed that the Mixed Number is consistent with Pauli matrix algebra and is a handy tool for working with Dirac electron theory, and its use as a mathematical method in physics has been studied. (1) We have applied Mixed Numbers in quantum mechanics: Mixed Number versions of the displacement operator, the vector differential operator, and the angular momentum operator have been developed, and the Mixed Number method has also been applied to the Klein-Gordon equation. (2) We have applied Mixed Numbers in electrodynamics: Mixed Number versions of Maxwell's equations, the electric and magnetic field quantities, and the Lorentz force have been found. (3) An associative transformation of Mixed Numbers fulfilling the Lorentz invariance requirement has been developed. (4) We have applied Mixed Number algebra as an extension of complex numbers. Mixed Numbers and the quaternions are in isomorphic correspondence, but they differ in algebraic detail: the multiplication of unit Mixed Numbers differs from the multiplication of unit quaternions. Since the Mixed Number has properties similar to those of Pauli matrix algebra, Mixed Number algebra is a more convenient tool for dealing with the Dirac equation.
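Reading a Mixed Number (a, A) as a + σ·A, the Pauli identity (σ·A)(σ·B) = A·B + iσ·(A×B) fixes the product rule (a, A)(b, B) = (ab + A·B, aB + bA + iA×B). The following sketch is a reconstruction from that identity, not the author's own code; the vector components are complex to hold the iA×B term.

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def mixed_mul(m1, m2):
    """Product of two mixed numbers given as (scalar, 3-vector):
    (a, A)(b, B) = (ab + A.B, aB + bA + i AxB)."""
    a, A = m1
    b, B = m2
    scalar = a * b + dot(A, B)
    cx = cross(A, B)
    vector = tuple(a * Bi + b * Ai + 1j * ci
                   for Ai, Bi, ci in zip(A, B, cx))
    return scalar, vector

# Pure unit vectors along x and y reproduce sigma_x sigma_y = i sigma_z
s, v = mixed_mul((0, (1, 0, 0)), (0, (0, 1, 0)))
```

The same rule gives (0, A)(0, A) = (|A|², 0), the scalar square of a pure vector, matching (σ·A)² = |A|².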

Keywords: mixed number, special relativity, quantum mechanics, electrodynamics, Pauli matrix

Procedia PDF Downloads 346
18817 Geosynthetic Tubes in Coastal Structures a Better Substitute for Shorter Planning Horizon: A Case Study

Authors: A. Pietro Rimoldi, B. Anilkumar Gopinath, C. Minimol Korulla

Abstract:

Coastal engineering structures are conventionally designed for a short planning horizon, usually 20 years. During their design life, these structures are subjected to different offshore climatic externalities such as waves, tides, and tsunamis, whose probabilities of occurrence vary. The impact these externalities frequently cause is of concern because it has a significant bearing on the capital and operating cost of a project. There can also be repeated short-interval occurrences of these externalities within the assumed planning horizon, which can heavily damage conventional coastal structures, which are mainly made of rock. Replacing the damaged portion to prevent complete collapse is time-consuming and expensive for hard rock structures, but if coastal structures are made of geosynthetic containment systems, such replacement is quickly possible in the period between two successive occurrences. To better understand and predict these occurrences, this study estimates the risk of encountering the various externalities within the design life period, based on the exponential distribution. This gives an idea of the frequency of occurrences, which in turn indicates whether replacement is necessary and, if so, at what interval replacements have to be effected. To validate this theoretical finding, a pilot project has been taken up in the field so that the impact of the externalities can be studied for both a hard rock and a geosynthetic tube structure. The paper brings out the salient features of a case study of a project in which geosynthetic tubes have been used for the re-formation of a seawall adjacent to a conventional rock structure on the Alappuzha coast, Kerala, India. The effectiveness of the geosystem in combating the impact of the short-term externalities is brought out.
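Under the exponential-distribution assumption the risk-of-encounter estimate amounts to R = 1 - exp(-n/T) for an externality of mean return period T years over a planning horizon of n years. A sketch with illustrative numbers, not the project's design values:

```python
from math import exp

def encounter_risk(return_period_years, horizon_years):
    """P(at least one event within the horizon) assuming exponential
    inter-arrival times with mean `return_period_years`."""
    return 1.0 - exp(-horizon_years / return_period_years)

# A 50-year-return-period storm over a 20-year planning horizon
risk = encounter_risk(50.0, 20.0)
```

Even a "rare" 50-year event has roughly a one-in-three chance of striking within the 20-year horizon, which is the kind of figure that drives the replacement-interval argument above.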

Keywords: climatic externalities, exponential distribution, geosystems, planning horizon

Procedia PDF Downloads 219