Search results for: analysis and design
28677 The Development of Congeneric Elicited Writing Tasks to Capture Language Decline in Alzheimer Patients
Authors: Lise Paesen, Marielle Leijten
Abstract:
People diagnosed with probable Alzheimer's disease suffer a gradual impairment of their language capacities, one which affects both their spoken and written communication. Our study aims at characterising the language decline in DAT patients with the use of congeneric elicited writing tasks. Within these tasks, a descriptive text has to be written based upon images with which the participants are confronted. A randomised set of images allows us to present the participants with a different task on every encounter, thus avoiding a recognition effect in this iterative study. This method is a revision of previous studies, in which participants were presented with a larger picture depicting an entire scene. In order to create the randomised set of images, existing pictures were adapted following strict criteria (e.g. frequency, age of acquisition (AoA), colour, ...). The resulting data set contained 50 images, belonging to several categories (vehicles, animals, humans, and objects). A pre-test was constructed to validate the created picture set; most images had been used before in spoken picture naming tasks, hence the same reaction times ought to be triggered in the typed picture naming task. Once validated, the effectiveness of the descriptive tasks was assessed. First, the participants (n=60 students, n=40 healthy elderly) performed a typing task, which provided information about the typing speed of each individual. Second, two descriptive writing tasks were carried out, one simple and one complex. The simple task uses 4 images (1 animal, 2 objects, 1 vehicle) and contains only elements with high frequency, a young AoA (<6 years), and fast reaction times. Slow reaction times, a later AoA (≥6 years), and low frequency were criteria for the complex task, which uses 6 images (2 animals, 1 human, 2 objects, and 1 vehicle). The data were collected with the keystroke logging programme Inputlog.
Keystroke logging tools log and time-stamp keystroke activity to reconstruct and describe text production processes. The data were analysed using a selection of writing process and product variables, such as general writing process measures, detailed pause analysis, linguistic analysis, and text length. As a covariate, the intrapersonal interkey transition times from the typing task were taken into account. The pre-test indicated that the new images lead to similar or even faster reaction times compared to the original images. All the images were therefore used in the main study. The texts produced in the description tasks were significantly longer compared to previous studies, providing sufficient text and process data for analyses. Preliminary analysis shows that the number of words produced differed significantly between the healthy elderly and the students, as did the mean length of production bursts, even though both groups needed the same time to produce their texts. However, the elderly took significantly more time to produce the complex task than the simple task. Nevertheless, the number of words per minute remained comparable between the simple and the complex task. The pauses within and before words varied, even when taking personal typing abilities (obtained from the typing task) into account.
Keywords: Alzheimer's disease, experimental design, language decline, writing process
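The core measures mentioned above (interkey transition times, pause detection, words per minute) can be sketched from a keystroke log. This is an illustrative minimal sketch, not Inputlog's actual format or algorithms: the event tuples and the 2000 ms pause threshold are assumptions.

```python
# Hypothetical keystroke-log analysis: inter-key transition times, pause
# detection, and words-per-minute. Event format (key, timestamp_ms) and the
# 2000 ms pause threshold are illustrative assumptions, not Inputlog's.

def interkey_intervals(events):
    """events: list of (key, timestamp_ms) in typing order."""
    return [t2 - t1 for (_, t1), (_, t2) in zip(events, events[1:])]

def pauses(events, threshold_ms=2000):
    """Indices of transitions at or above the pause threshold."""
    return [i for i, dt in enumerate(interkey_intervals(events)) if dt >= threshold_ms]

def words_per_minute(events):
    text = "".join(k for k, _ in events)
    minutes = (events[-1][1] - events[0][1]) / 60000.0
    return len(text.split()) / minutes

# A tiny invented log: "the cat" with one long pause before the second word.
log = [("t", 0), ("h", 150), ("e", 320), (" ", 500),
       ("c", 2800), ("a", 2950), ("t", 3100)]
```

An intrapersonal typing-speed covariate, as described above, would simply be each participant's own interval statistics from the separate typing task.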
Procedia PDF Downloads 273
28676 African Culture and Youth Morality: A Critique of the On-Going Transitional Rites in Thulamela Municipality, South Africa
Authors: Bassey Rofem Inyang, Matshidze Pfarelo, Mabale Dolphin
Abstract:
Using a qualitative descriptive design, this study established the consequences of the on-going transitional rites on youth morality in the Thulamela Local Municipality, South Africa. The participants were sampled using non-random procedures, specifically purposive and snowball sampling techniques. A semi-structured interview guide was used to collect data from the Indigenous Knowledge (IK) custodians, the parents of the youths, and the youths themselves until the point of saturation. The analysis was performed using a thematic content method. As themes and sub-themes emerged, broad categories were generated to differentiate and explain the thoughts expressed by the various respondents and the observations made in the field. The study findings suggest that the on-going transitional rites are characterised by weekend social activities involving substance use and abuse among the youths at recreational spots. The transitional rites are structured under the guise of “freaks” as an evolving culture among the youths. The freaks culture is a counterculture of the usual initiation schools for transitional rites of passage, which are believed to instill morality among youths. The findings comprehensively show that the on-going transitional rites foster inappropriate youth morality. This study concluded that the on-going transitional rites activities and practices have evolved into a current socialization standard for quick maturity status; as a result, it will be challenging to provide a complete turnaround of this evolving culture. The study, however, recommends building on the existing transitional rites of passage to moderate appropriate youth morality in Thulamela communities.
Keywords: morality, transitional rites, youths, behaviour
Procedia PDF Downloads 92
28675 Identification of Hepatocellular Carcinoma Using Supervised Learning Algorithms
Authors: Sagri Sharma
Abstract:
Analysis of diseases integrating multiple factors increases the complexity of the problem; the development of frameworks for the analysis of diseases is therefore an issue of intense current research. Due to the inter-dependence of the various parameters, the use of traditional methodologies has not been very effective, and newer methodologies are being sought to deal with the problem. Supervised learning algorithms are commonly used for performing prediction on previously unseen data. These algorithms are applied in fields ranging from image analysis to protein structure and function prediction; they are trained on a known dataset to produce a predictor model that generates reasonable predictions for new data. Gene expression profiles generated by DNA analysis experiments can be quite complex, since these experiments can involve hypotheses spanning entire genomes. The well-known machine learning algorithm Support Vector Machine is therefore applied to analyze the expression levels of thousands of genes simultaneously in a timely, automated, and cost-effective way. The objectives of the presented work are to develop a methodology for identifying genes relevant to Hepatocellular Carcinoma (HCC) from gene expression datasets, utilizing supervised learning algorithms and statistical evaluations, and to develop a predictive framework that can perform classification tasks on new, unseen data.
Keywords: artificial intelligence, biomarker, gene expression datasets, hepatocellular carcinoma, machine learning, supervised learning algorithms, support vector machine
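The supervised learning workflow described above can be sketched with a minimal linear SVM trained by hinge-loss subgradient descent on a toy sample-by-gene expression matrix. This is an illustrative stand-in for the authors' pipeline, not their actual code; the data and hyperparameters are invented.

```python
import numpy as np

# Minimal linear SVM (hinge loss + L2 regularisation, subgradient descent)
# on a toy "expression" matrix: rows are samples, columns are genes.
# Labels: +1 = HCC, -1 = control. All values are invented for the sketch.

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """y in {-1, +1}; returns weight vector w and bias b."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1                                   # margin violators
        grad_w = lam * w - (y[mask, None] * X[mask]).sum(axis=0) / n
        grad_b = -y[mask].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def predict(X, w, b):
    return np.where(X @ w + b >= 0, 1, -1)

# Two hypothetical genes whose expression separates HCC from control.
X = np.array([[2.0, 0.1], [1.8, 0.3], [0.2, 1.9], [0.1, 2.2]])
y = np.array([1, 1, -1, -1])
w, b = train_linear_svm(X, y)
```

In practice a library implementation with kernel support and cross-validated hyperparameters would replace this sketch, but the trained weights already hint at which genes drive the classification.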
Procedia PDF Downloads 428
28674 Dosimetric Analysis of Intensity Modulated Radiotherapy versus 3D Conformal Radiotherapy in Adult Primary Brain Tumors: Regional Cancer Centre, India
Authors: Ravi Kiran Pothamsetty, Radha Rani Ghosh, Baby Paul Thaliath
Abstract:
Radiation therapy has undergone many advancements, evolving from 2D to 3D techniques. Recently, the rapid pace of discovery, cutting-edge technology, and clinical trials have driven innovative advances in computer technology and treatment planning, with an upgrade to intensity modulated radiotherapy (IMRT), which delivers a homogeneous dose to the tumor while limiting the dose to normal tissues. The present study was a hospital-based experience comparing two different conformal radiotherapy techniques for brain tumors. This analytical study was conducted at the Regional Cancer Centre, India, from January 2014 to January 2015. Ten patients were selected after applying the inclusion and exclusion criteria. All the patients were treated on a Siemens Artiste linear accelerator. The tolerance level for the maximum dose was 6.0 Gy for the lenses and 54.0 Gy for the brain stem, optic chiasm, and optic nerves, as per RTOG criteria. Mean and standard deviation values of PTV D98%, D95%, and D2% for IMRT were 93.16±2.9, 95.01±3.4, and 103.1±1.1, respectively; for 3D-CRT they were 91.4±4.7, 94.17±2.6, and 102.7±0.39, respectively. The PTV maximum dose (%) for IMRT and 3D-CRT was 104.7±0.96 and 103.9±1.0, respectively. The maximum dose to the tumor can be delivered with IMRT within acceptable toxicity limits. Variables such as expertise, tumor location, patient condition, and the TPS influence the outcome of treatment.
Keywords: brain tumors, intensity modulated radiotherapy (IMRT), three dimensional conformal radiotherapy (3D-CRT), radiation therapy oncology group (RTOG)
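The dose-volume metrics compared above (PTV D98%, D95%, D2%) can be computed from a voxel dose array. A minimal sketch, using the common convention that Dx% is the minimum dose received by the hottest x% of the volume; the voxel doses and the 60 Gy prescription below are invented for illustration.

```python
import numpy as np

# Dose-volume metric Dx%: the dose covering x% of the structure volume,
# expressed as a percentage of the prescription dose. Voxel doses and the
# 60 Gy prescription are illustrative, not the study's data.

def dvh_metric(doses_gy, x_percent, prescription_gy):
    """Dx% as a percentage of the prescription dose."""
    d = np.sort(np.asarray(doses_gy, dtype=float))[::-1]   # hottest voxels first
    idx = int(np.ceil(x_percent / 100.0 * len(d))) - 1     # x% of volume covered
    return 100.0 * d[idx] / prescription_gy

ptv = [58.1, 59.0, 59.5, 60.2, 60.8, 61.0, 61.5, 62.0, 62.4, 63.0]
d98 = dvh_metric(ptv, 98, prescription_gy=60.0)   # near-minimum PTV dose
d2 = dvh_metric(ptv, 2, prescription_gy=60.0)     # near-maximum PTV dose
```

A real TPS evaluates these metrics over tens of thousands of voxels per structure, but the percentile logic is the same.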
Procedia PDF Downloads 237
28673 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach
Authors: Elias K. Maragos, Petros E. Maravelakis
Abstract:
In Dynamic Data Envelopment Analysis (DDEA), a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this case, as most researchers accept, there are outputs produced by a DMU in one period to be used as inputs in a future period; these outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of the input, output, or intermediate data, assuming that the distribution of their virtual values does not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
Keywords: Dynamic Data Envelopment Analysis, DDEA, piecewise linear inputs, piecewise linear outputs
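The DEA machinery that the dynamic model above extends can be illustrated with a static, single-period input-oriented CCR efficiency score, solved as a linear programme. This sketch omits the time dimension, intermediates, and the piecewise-linear decomposition that are the paper's contribution; the two-DMU data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR DEA (envelopment form) for DMU o:
#   min theta  s.t.  sum_j lam_j * x_ij <= theta * x_io  (each input i)
#                    sum_j lam_j * y_rj >= y_ro          (each output r)
#                    lam_j >= 0
# Decision vector: [theta, lam_1, ..., lam_n].

def ccr_efficiency(X, Y, o):
    """X: inputs (n_dmu x m), Y: outputs (n_dmu x s); returns theta for DMU o."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                       # minimise theta
    A, b = [], []
    for i in range(m):                                 # input constraints
        A.append(np.r_[-X[o, i], X[:, i]]); b.append(0.0)
    for r in range(s):                                 # output constraints
        A.append(np.r_[0.0, -Y[:, r]]); b.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

X = np.array([[2.0], [4.0]])   # one input, two DMUs (toy data)
Y = np.array([[2.0], [2.0]])   # one output, equal for both
theta = ccr_efficiency(X, Y, 1)  # DMU 1 uses twice the input: theta = 0.5
```

A dynamic model chains such programmes across periods through the intermediates; the piecewise-linear extension replaces each linear virtual value with a segmented one.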
Procedia PDF Downloads 159
28672 Evaluation of Virtual Reality for the Rehabilitation of Athlete Lower Limb Musculoskeletal Injury: A Method for Obtaining Practitioner’s Viewpoints through Observation and Interview
Authors: Hannah K. M. Tang, Muhammad Ateeq, Mark J. Lake, Badr Abdullah, Frederic A. Bezombes
Abstract:
Based on a theoretical assessment of the current literature, virtual reality (VR) could help to treat sporting injuries in a number of ways. However, it is important to obtain rehabilitation specialists’ perspectives in order to design, develop, and validate suitable content for a VR application focused on treatment. Subsequently, a one-day observation and interview study focused on the use of VR for the treatment of lower limb musculoskeletal conditions in athletes was conducted with rehabilitation specialists at St George’s Park, the England National Football Centre. The current paper establishes the methods suitable for obtaining practitioners’ viewpoints through observation and interview in this context. Particular detail is provided regarding the method of qualitatively processing interview results using the qualitative data analysis software tool NVivo in order to produce a narrative of overarching themes. The observations and overarching themes identified could be used as a framework and success criteria for a VR application developed in future research. In conclusion, this work explains the methods deemed suitable for obtaining practitioners’ viewpoints through observation and interview. This was required in order to highlight characteristics and features of a VR application designed to treat lower limb musculoskeletal injury in athletes, and it could be built upon to direct future work.
Keywords: athletes, lower-limb musculoskeletal injury, rehabilitation, return-to-sport, virtual reality
Procedia PDF Downloads 255
28671 A Knowledge-Based Development of Risk Management Approaches for Construction Projects
Authors: Masoud Ghahvechi Pour
Abstract:
Risk management is a systematic and regular process of identifying, analyzing, and responding to risks throughout the project's life cycle in order to achieve the optimal level of elimination, reduction, or control of risk. The purpose of project risk management is to increase the probability and effect of positive events and reduce the probability and effect of unpleasant events on the project. Risk management is one of the most fundamental parts of project management, so much so that unmanaged or untransferred risks can be a primary factor of failure in a project. Effective risk management does not mean simple risk avoidance, which is apparently the cheapest option. The main problem with this option is economic: what is potentially profitable is by definition risky, while what does not pose a risk is economically uninteresting and does not bring tangible benefits. Therefore, in relation to the implemented project, effective risk management means finding a "middle ground". On the one hand, this includes protection against risk from a negative direction by means of accurate identification and classification of risk, which leads to a comprehensive analysis. On the other hand, management using all mathematical and analytical tools should be based on maximising the benefits of these decisions. Detailed analysis, taking into account all aspects of the company, including stakeholder analysis, will allow us to add what will become tangible benefits for our project in the future to effective risk management. Risk identification determines which types of risk may affect the project, refers to their specific parameters, and estimates the probability of their occurrence in the project.
These conditions can be divided into three groups: certainty, uncertainty, and risk, which in turn support three attitudes toward investment: risk preference, risk neutrality, and risk aversion, along with their measurement. The result of risk identification and project analysis is a list of events indicating the cause and probability of each event, together with a final assessment of its impact on the environment.
Keywords: risk, management, knowledge, risk management
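The identification step described above typically ends in a risk register that scores each event by probability and impact. A minimal sketch of such a register, with the product of the two ranking the risks; the scales and the three entries are illustrative assumptions, not from the paper.

```python
# Probability-impact risk register: each identified event gets a probability
# estimate (0-1) and an impact estimate (1-5 scale); their product ranks the
# risks for response planning. All entries and scales are invented.

def risk_score(probability, impact):
    """Expected severity: probability (0-1) times impact (1-5)."""
    return probability * impact

register = [
    ("supplier delay",       0.40, 3),
    ("ground water ingress", 0.15, 5),
    ("design change",        0.50, 2),
]

ranked = sorted(register, key=lambda r: risk_score(r[1], r[2]), reverse=True)
```

Real registers add qualitative fields (cause, owner, response strategy) to each event, matching the cause-probability-impact list described above.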
Procedia PDF Downloads 65
28670 Quantum Confinement in LEEH Capped CdS Nanocrystalline
Authors: Mihir Hota, Namita Jena, S. N. Sahu
Abstract:
LEEH (L-cysteine ethyl ester hydrochloride) capped CdS semiconductor nanocrystals were grown at 80 °C using a simple chemical route. Photoluminescence (PL), optical absorption (UV-Vis), X-ray diffraction (XRD), and transmission electron microscopy (TEM) were carried out to evaluate the structural and optical properties of the nanocrystals. Optical absorption studies were used to optimize the sample. XRD and TEM analysis shows that the nanocrystals have an FCC structure with an average size of 3 nm, while a bandgap of 2.84 eV is estimated from the photoluminescence analysis. The nanocrystals emit bluish light when excited with a 355 nm laser.
Keywords: cadmium sulphide, nanostructures, luminescence, optical properties
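A quick consistency check of the numbers above: converting the estimated 2.84 eV bandgap to an emission wavelength via E = hc/λ (hc ≈ 1239.84 eV·nm) gives light in the blue region, matching the reported bluish emission, and the blue-shift relative to bulk CdS (~2.42 eV, a literature value not stated in the abstract) is the quantum confinement signature.

```python
# Bandgap-to-wavelength conversion: E(eV) = hc / lambda, hc ~ 1239.84 eV*nm.
# The bulk CdS bandgap of 2.42 eV is a common literature value used here
# only to illustrate the confinement-induced blue-shift.

HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def ev_to_nm(energy_ev):
    return HC_EV_NM / energy_ev

emission_nm = ev_to_nm(2.84)    # emission wavelength for the 2.84 eV gap
blueshift_ev = 2.84 - 2.42      # shift from bulk CdS due to confinement
```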
Procedia PDF Downloads 395
28669 Comparative Analysis of the Expansion Rate and Soil Erodibility Factor (K) of Some Gullies in Nnewi and Nnobi, Anambra State Southeastern Nigeria
Authors: Nzereogu Stella Kosi, Igwe Ogbonnaya, Emeh Chukwuebuka Odinaka
Abstract:
A comparative analysis of the expansion rate and soil erodibility of some gullies in Nnewi and Nnobi, both within the Nanka Formation, was carried out. The study integrated field observations, geotechnical analysis, slope stability analysis, multivariate statistical analysis, gully expansion rate analysis, and determination of the soil erodibility factor (K) from the Revised Universal Soil Loss Equation (RUSLE). Fifteen representative gullies were studied extensively, and the results reveal that the geotechnical properties of the soil, topography, vegetation cover, rainfall intensity, and anthropogenic activities in the study area were the major factors propagating gullying and influencing the erodibility of the soils. The specific gravity of the soils ranged from 2.45-2.66 and 2.54-2.78 for Nnewi and Nnobi, respectively. Grain size distribution analysis revealed that the soils are composed of gravel (5.77-17.67%), sand (79.90-91.01%), and fines (2.36-4.05%) for Nnewi, and gravel (7.01-13.65%), sand (82.47-88.67%), and fines (3.78-5.02%) for Nnobi. The soils are moderately permeable, with values ranging from 2.92 × 10⁻⁵ to 6.80 × 10⁻⁴ m/s and 2.35 × 10⁻⁶ to 3.84 × 10⁻⁴ m/s for Nnewi and Nnobi, respectively. All have low cohesion values, ranging from 1-5 kPa and 2-5 kPa, and internal friction angles ranging from 29-38° and 30-34° for Nnewi and Nnobi, respectively, which suggests that the soils have low shear strength and are susceptible to shear failure. Furthermore, the compaction test revealed that the soils are loose and easily erodible, with maximum dry density (MDD) and optimum moisture content (OMC) values ranging from 1.82-2.11 g/cm³ and 8.20-17.81% for Nnewi, and 1.98-2.13 g/cm³ and 6.00-17.80% for Nnobi, respectively. The plasticity index (PI) of the fines showed that they are nonplastic to low-plasticity soils and highly liquefiable, with values ranging from 0-10% and 0-9% for Nnewi and Nnobi, respectively.
Multivariate statistical analyses were used to establish relationships among the determined parameters. Slope stability analysis gave factor of safety (FoS) values in the range of 0.50-0.76 and 0.82-0.95 for the saturated condition, and 0.73-0.98 and 0.87-1.04 for the unsaturated condition, for Nnewi and Nnobi respectively, indicating that the slopes are generally unstable to critically stable. The erosion expansion rate analysis for a fifteen-year period (2005-2020) revealed average longitudinal expansion rates of 36.05 m/yr, 10.76 m/yr, and 183 m/yr for the Nnewi, Nnobi, and Nanka type gullies, respectively. The soil erodibility factors (K) are 8.57 × 10⁻² and 1.62 × 10⁻⁴ for Nnewi and Nnobi, respectively, indicating that the soils in Nnewi have higher erodibility potential than those of Nnobi. From the study, both the Nnewi and Nnobi areas are highly prone to erosion. However, based on the relatively lower fines content of the soil, relatively lower topography, steeper slope angles, and sparsely vegetated terrain in Nnewi, soil erodibility and gully intensity are more profound in Nnewi than in Nnobi.
Keywords: soil erodibility, gully expansion, Nnewi-Nnobi, slope stability, factor of safety
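One common way to compute the RUSLE soil erodibility factor K mentioned above is the Wischmeier-Smith nomograph approximation, which combines particle size, organic matter, soil structure, and permeability. The form below is in US customary K units; the soil parameter values are illustrative assumptions, not data from the study.

```python
# Wischmeier-Smith nomograph approximation of the RUSLE K factor
# (US customary units). Structure code: 1-4; permeability code: 1-6.
# The input values below are invented for illustration.

def erodibility_k(silt_vfs_pct, clay_pct, om_pct, structure_code, permeability_code):
    """K = [2.1e-4 * M^1.14 * (12 - OM) + 3.25(s - 2) + 2.5(p - 3)] / 100."""
    m = silt_vfs_pct * (100.0 - clay_pct)      # particle-size parameter M
    return (2.1e-4 * m**1.14 * (12.0 - om_pct)
            + 3.25 * (structure_code - 2)
            + 2.5 * (permeability_code - 3)) / 100.0

k = erodibility_k(silt_vfs_pct=40, clay_pct=20, om_pct=2,
                  structure_code=2, permeability_code=3)   # ~0.21
```

Higher silt plus very-fine-sand fractions and lower organic matter push K up, consistent with the sandy, low-fines Nnewi soils being the more erodible of the two areas.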
Procedia PDF Downloads 127
28668 Investigation of Genetic Diversity of Tilia tomentosa Moench. (Silver Lime) in Duzce-Turkey
Authors: Ibrahim Ilker Ozyigit, Ertugrul Filiz, Seda Birbilener, Semsettin Kulac, Zeki Severoglu
Abstract:
In this study, we performed a genetic diversity analysis of Tilia tomentosa genotypes using randomly amplified polymorphic DNA (RAPD) primers. A total of 28 genotypes were used, including 25 members from the urban ecosystem and 3 genotypes from the forest ecosystem as an outgroup. Eight RAPD primers produced a total of 53 bands, of which 48 (90.6%) were polymorphic. The percentage of polymorphic loci (P), observed number of alleles (Na), effective number of alleles (Ne), Nei's (1973) gene diversity (h), and Shannon's information index (I) were found to be 94.29%, 1.94, 1.60, 0.34, and 0.50, respectively. The unweighted pair-group method with arithmetic average (UPGMA) cluster analysis revealed two major groups. The genotypes of the urban and forest ecosystems showed genetic similarities between 28% and 92%, and these genotypes did not separate from each other in the UPGMA tree. The urban and forest genotypes also clustered together in principal component analysis (PCA).
Keywords: Tilia tomentosa, genetic diversity, urban ecosystem, RAPD, UPGMA
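The two diversity statistics reported above can be computed per locus from allele frequencies: Nei's gene diversity h = 1 − Σp², and Shannon's information index I = −Σp ln p. A minimal sketch with invented frequencies, not the study's RAPD band data.

```python
import math

# Per-locus diversity statistics from allele frequencies. The frequencies
# below are invented; in a RAPD study they would be estimated from band
# presence/absence across the 28 genotypes.

def nei_h(freqs):
    """Nei's gene diversity: h = 1 - sum(p_i^2)."""
    return 1.0 - sum(p * p for p in freqs)

def shannon_i(freqs):
    """Shannon's information index: I = -sum(p_i * ln p_i)."""
    return -sum(p * math.log(p) for p in freqs if p > 0)

h = nei_h([0.6, 0.4])        # 1 - (0.36 + 0.16) = 0.48
i = shannon_i([0.6, 0.4])    # about 0.673
```

Averaging these per-locus values over all polymorphic bands yields population-level figures like the h = 0.34 and I = 0.50 reported above.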
Procedia PDF Downloads 508
28667 Flow-Induced Vibration Marine Current Energy Harvesting Using a Symmetrical Balanced Pair of Pivoted Cylinders
Authors: Brad Stappenbelt
Abstract:
The phenomenon of vortex-induced vibration (VIV) for elastically restrained cylindrical structures in cross-flows is relatively well investigated. The utility of this mechanism in harvesting energy from marine current and tidal flows is, however, arguably still in its infancy. With relatively few moving components, a flow-induced vibration-based energy conversion device augurs low complexity compared to the commonly employed turbine design. Despite the interest in this concept, a practical device has yet to emerge. For optimal system performance, it is desirable to design for a very low mass or mass moment of inertia ratio. The device operating range, in particular, is maximized below the vortex-induced vibration critical point, where an infinite resonant response region is realized. An unfortunate consequence of this requirement is large buoyancy forces that need to be mitigated by gravity-based, suction-caisson or anchor mooring systems. The focus of this paper is the testing of a novel VIV marine current energy harvesting configuration that utilizes a symmetrical and balanced pair of horizontal pivoted cylinders. The results of several years of experimental investigation, utilizing the University of Wollongong fluid mechanics laboratory towing tank, are analyzed and presented. A reduced velocity test range of 0 to 60 was covered across a large array of device configurations. In particular, power take-off damping ratios spanning from 0.044 to critical damping were examined in order to determine the optimal conditions and hence the maximum device energy conversion efficiency. The experiments conducted revealed acceptable energy conversion efficiencies of around 16% and desirable low flow-speed operating ranges when compared to traditional turbine technology.
The potentially out-of-phase spanwise VIV cells on each arm of the device synchronized naturally, as no decrease in amplitude response was observed and the energy conversion efficiencies were comparable to those of the single-cylinder arrangement. In addition to the spatial design benefits related to the horizontal device orientation, the main advantage demonstrated by the current symmetrical horizontal configuration is that it allows large-velocity-range resonant response conditions without excessive buoyancy. The novel configuration proposed shows clear promise in overcoming many of the practical implementation issues related to flow-induced vibration marine current energy harvesting.
Keywords: flow-induced vibration, vortex-induced vibration, energy harvesting, tidal energy
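The two quantities that frame the results above can be sketched directly: the reduced velocity U_r = U / (f_n D) normalises flow speed by the cylinder's natural frequency and diameter, and the conversion efficiency compares harvested power with the kinetic power flux through the cylinder's projected area. All numerical values below are illustrative, not the towing-tank data.

```python
# Reduced velocity and energy conversion efficiency for a VIV harvester.
# rho = 1025 kg/m^3 is a typical seawater density; the cylinder geometry,
# flow speed, and harvested power are invented for the sketch.

def reduced_velocity(flow_speed, natural_freq_hz, diameter_m):
    """U_r = U / (f_n * D), dimensionless."""
    return flow_speed / (natural_freq_hz * diameter_m)

def conversion_efficiency(harvested_w, flow_speed, diameter_m, length_m, rho=1025.0):
    """Harvested power over 0.5 * rho * U^3 * (D * L)."""
    flux = 0.5 * rho * flow_speed**3 * diameter_m * length_m
    return harvested_w / flux

ur = reduced_velocity(flow_speed=1.0, natural_freq_hz=0.5, diameter_m=0.2)
eta = conversion_efficiency(harvested_w=16.4, flow_speed=1.0,
                            diameter_m=0.2, length_m=1.0)   # ~0.16
```

An efficiency around 0.16 corresponds to the ~16% figure quoted above; the U_r sweep of 0 to 60 in the experiments maps out where the resonant response sits.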
Procedia PDF Downloads 145
28666 Organotin (IV) Based Complexes as Promiscuous Antibacterials: Synthesis in vitro, in Silico Pharmacokinetic, and Docking Studies
Authors: Wajid Rehman, Sirajul Haq, Bakhtiar Muhammad, Syed Fahad Hassan, Amin Badshah, Muhammad Waseem, Fazal Rahim, Obaid-Ur-Rahman Abid, Farzana Latif Ansari, Umer Rashid
Abstract:
Five novel triorganotin (IV) compounds have been synthesized and characterized. The tin atom is penta-coordinated, assuming a trigonal-bipyramidal geometry. Using in silico derived parameters, the objective of our study is to design and synthesize promiscuous antibacterials potent enough to combat resistance. Among the various synthesized organotin (IV) complexes, compound 5 was found to be a potent antibacterial agent against various bacterial strains. Further lead optimization of drug-like properties was evaluated through in silico predictions. Data mining and computational analysis were utilized to characterise the compound promiscuity phenomenon in order to reduce the drug attrition rate in designing antibacterials. Xanthine oxidase and human glucose-6-phosphatase were found to be the only true positive off-target hits by the ChEMBL database and by others utilizing the similarity ensemble approach. Propensity towards the α-3 receptor, human macrophage migration factor, and thiazolidinedione were found to be false positive off-targets, with E-value > 10⁻⁴, for compounds 1, 3, and 4. Further, the positive drug-drug interaction of compound 1 as a uricosuric was validated by all databases and by docked protein targets with sequence similarity and compositional matrix alignment via BLAST software. The promiscuity of compound 5 was further confirmed by in silico binding to different antibacterial targets.
Keywords: antibacterial activity, drug promiscuity, ADMET prediction, metallo-pharmaceutical, antimicrobial resistance
Procedia PDF Downloads 501
28665 The Evaluation of the Cognitive Training Program for Older Adults with Mild Cognitive Impairment: Protocol of a Randomized Controlled Study
Authors: Hui-Ling Yang, Kuei-Ru Chou
Abstract:
Background: Studies show that cognitive training can effectively delay cognitive decline. However, there are several gaps in the previous studies of cognitive training in mild cognitive impairment: 1) previous studies enrolled mostly healthy older adults, with few recruiting older adults with cognitive impairment; 2) they had limited generalizability and lacked long-term follow-up data and measurements of the impact on activities of daily living, and only 37% were randomized controlled trials (RCTs); 3) little cognitive training has been developed specifically for mild cognitive impairment. Objective: This study sought to investigate the changes in cognitive function, activities of daily living, and degree of depressive symptoms in older adults with mild cognitive impairment after cognitive training. Methods: This double-blind randomized controlled study has a 2-arm parallel group design. Study subjects are older adults diagnosed with mild cognitive impairment in residential care facilities. 124 subjects will be randomized, using permuted block randomization, into either an intervention group (cognitive training, CT) or an active control group (passive information activities, PIA). Therapeutic adherence, sample attrition rate, medication compliance, and adverse events will be monitored during the study period, and missing data analyzed using intent-to-treat (ITT) analysis. Results: Training sessions of the CT group are 45 minutes/day, 3 days/week, for 12 weeks (36 sessions in total). The training of the active control group follows the same schedule (45 min/day, 3 days/week, for 12 weeks, for a total of 36 sessions). The primary outcome is cognitive function, using the Mini-Mental Status Examination (MMSE); the secondary outcome indicators are: 1) activities of daily living, using Lawton's Instrumental Activities of Daily Living (IADL) scale, and 2) degree of depressive symptoms, using the Geriatric Depression Scale-Short Form (GDS-SF).
Latent growth curve modeling will be used in the repeated measures statistical analysis to estimate the trajectory of improvement, by examining the rate and pattern of change in cognitive function, activities of daily living, and degree of depressive symptoms for intervention efficacy over time; the effects will be evaluated at immediate post-test, 3 months, 6 months, and one year after the last session. Conclusions: We constructed a rigorous CT program adhering to the Consolidated Standards of Reporting Trials (CONSORT) reporting guidelines. We expect to determine the improvement in cognitive function, activities of daily living, and degree of depressive symptoms of older adults with mild cognitive impairment after using the CT.
Keywords: mild cognitive impairment, cognitive training, randomized controlled study
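The permuted block randomization named in the protocol can be sketched as follows: within each block, equal numbers of CT and PIA assignments are shuffled, so the two arms stay balanced throughout enrolment. The block size of 4 and the seed are illustrative choices, not necessarily the trial's.

```python
import random

# Permuted block randomisation: each block holds an equal number of
# assignments per arm, shuffled independently. Block size 4 and the fixed
# seed are illustrative assumptions, not the protocol's specification.

def permuted_block_allocation(n_subjects, block_size=4, arms=("CT", "PIA"), seed=42):
    rng = random.Random(seed)
    sequence = []
    while len(sequence) < n_subjects:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)           # permute within the block
        sequence.extend(block)
    return sequence[:n_subjects]

alloc = permuted_block_allocation(124)   # the trial's planned sample size
```

With 124 subjects and blocks of 4, the arms end exactly balanced at 62 each; in a real trial the sequence would be generated and concealed by someone independent of recruitment.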
Procedia PDF Downloads 447
28664 The Reenactment of Historic Memory and the Ways to Read past Traces through Contemporary Architecture in European Urban Contexts: The Case Study of the Medieval Walls of Naples
Authors: Francesco Scarpati
Abstract:
Because of their long history, ranging from ancient times to the present day, European cities feature many historical layers, whose individual identities are represented by traces surviving in the urban design. However, urban transformations, in particular those produced by the property speculation phenomena of the 20th century, have often compromised the readability of these traces, resulting in a loss of the historical identities of the individual layers. The purpose of this research is, therefore, a reflection on the theme of the reenactment of historical memory in stratified European contexts and on how contemporary architecture can help to reveal the past signs of cities. The research starts from an analysis of a series of emblematic examples that have already provided original solutions to the described problem, moving from the architectural detail scale to the urban and landscape scale. The results of these analyses are then applied to the case study of the city of Naples, an emblematic example of a stratified city of ancient Greek origin, where it is possible to read most of the traces of its transformations. Particular consideration is given to the trace of the medieval walls of the city, which long ago clearly divided the city from the outer fields and which is no longer readable today. Finally, solutions and methods of intervention are proposed to ensure that the trace of the walls, read as a boundary, can be revealed through the contemporary project.
Keywords: contemporary project, historic memory, historic urban contexts, medieval walls, Naples, stratified cities, urban traces
Procedia PDF Downloads 263
28663 Effects of Temperature and the Use of Bacteriocins on Cross-Contamination from Animal Source Food Processing: A Mathematical Model
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cerdova
Abstract:
The contamination of food by microbial agents is a common problem in the industry, especially in the elaboration of animal source products. Incorrect manipulation of the machinery or of the raw materials can cause a decrease in production or an epidemiological outbreak due to intoxication. In order to improve food product quality, different methods have been used to reduce or, at least, slow down the growth of pathogens, especially spoilage, infectious, or toxigenic bacteria. These methods are usually carried out under low temperatures and short processing times (abiotic agents), along with the application of antibacterial substances such as bacteriocins (biotic agents), in a controlled and efficient way that fulfills the purpose of bacterial control without damaging the final product. The objective of the present study is therefore to design a secondary mathematical model that allows the prediction of the impact of both the biotic and abiotic factors associated with animal source food processing. To accomplish this objective, the authors propose a three-dimensional differential equation model whose components are bacterial growth; the release, production, and artificial incorporation of bacteriocins; and changes in the pH level of the medium. All three dimensions are constantly influenced by the temperature of the medium. Secondly, this model is adapted to an idealized situation of cross-contamination in animal source food processing, the study agents being both the animal product and the contact surface. Thirdly, stochastic simulations and a parametric sensitivity analysis are compared with reference data. The main result obtained from the analysis and simulations of the mathematical model was the discovery that, although bacterial growth can be stopped at lower temperatures, even lower ones are needed to eradicate it.
However, this can be not only expensive but also counterproductive in terms of the quality of the raw materials while, on the other hand, higher temperatures accelerate bacterial growth. In other respects, the use of bacteriocins is an effective alternative in the short and medium terms. Moreover, a low pH level is an indicator of bacterial growth, since many spoilage bacteria are lactic acid producers. Lastly, the processing times are a secondary agent of concern when the rest of the aforementioned agents are under control. Our main conclusion is that when a mathematical model is adapted to the context of an industrial process, it can generate new tools that predict bacterial contamination, the impact of bacterial inhibition, and processing method times. In addition, the proposed mathematical model provides a logistic input of broad application, which can be replicated for non-meat food products, other pathogens, or even contamination by cross-contact with allergenic foods.
Keywords: bacteriocins, cross-contamination, mathematical model, temperature
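A toy realisation of the three-dimensional model described above can be integrated numerically: bacterial population N, bacteriocin concentration B, and medium pH P, with the growth rate modulated by temperature. The functional forms and every parameter value below are illustrative assumptions, not the authors' calibrated model.

```python
from scipy.integrate import solve_ivp

# Toy 3D ODE system: N = bacteria (CFU), B = bacteriocin, P = pH.
# Growth rate rises with temperature above a 4 degC threshold; bacteriocins
# kill bacteria; growing bacteria release bacteriocin and acidify the
# medium. All rates and thresholds are invented for illustration.

def rhs(t, state, temp_c):
    N, B, P = state
    r = 0.05 * max(temp_c - 4.0, 0.0)            # temperature-dependent growth
    dN = r * N * (1 - N / 1e9) - 0.5 * B * N     # logistic growth minus kill
    dB = 1e-10 * N - 0.1 * B                     # bacteriocin release and decay
    dP = -1e-10 * N                              # acidification by the bacteria
    return [dN, dB, dP]

state0 = [1e3, 0.0, 6.5]                          # initial N, B, pH
cold = solve_ivp(rhs, (0, 24), state0, args=(4.0,), rtol=1e-6)   # refrigeration
warm = solve_ivp(rhs, (0, 24), state0, args=(25.0,), rtol=1e-6)  # room temp
```

Even this crude sketch reproduces the qualitative conclusions above: at 4 °C growth is stopped but the population is not eradicated, while at 25 °C the bacteria multiply by orders of magnitude and the pH drops as they grow.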
Procedia PDF Downloads 143
28662 Effects of Variation of Centers in the Torsional Analysis of Asymmetrical Buildings by Performing Nonlinear Static Analysis
Authors: Md Masihuddin Siddiqui, Abdul Haakim Mohammed
Abstract:
Earthquakes are the most unpredictable and devastating of all natural disasters. The behaviour of a building during an earthquake depends on several factors, such as stiffness, adequate lateral strength, ductility, and configuration. Experience from the performance of buildings during past earthquakes has shown that buildings with regular geometry and uniformly distributed mass and stiffness, in plan as well as in elevation, suffer much less damage than irregular configurations. Three centers, namely the centre of mass, the centre of strength, and the centre of stiffness, are the torsional parameters which contribute to the strength of a building during an earthquake. Inertial forces and resistive forces in a structural system act through the centre of mass and the centre of rigidity, respectively, and together oppose the forces produced during seismic excitation. These centers should therefore be positioned where the structural system is strongest, so that the earthquake has a minimal effect on the structure. In this paper, the effects of varying strength eccentricity and stiffness eccentricity in reducing the torsional responses of asymmetrical buildings are studied using pushover analysis. The maximum reduction of base torsion was observed in the case of minimum strength eccentricity, and the least reduction was observed in the case of minimum stiffness eccentricity.
Keywords: strength eccentricity, stiffness eccentricity, asymmetric structure, base torsion, pushover analysis
Procedia PDF Downloads 292
28661 Climate Change and Tourism: A Scientometric Analysis Using CiteSpace
Authors: Yan Fang, Jie Yin, Bihu Wu
Abstract:
The interaction between climate change and tourism is one of the most promising research areas of recent decades. In this paper, a scientometric analysis of 976 academic publications related to climate change and tourism, published between 1990 and 2015, is presented in order to characterize the intellectual landscape by identifying and visualizing the evolution of the collaboration network, the co-citation network, and emerging trends in citation bursts and keyword co-occurrence. The results show that the number of publications in this field has increased rapidly and that it has become an interdisciplinary and multidisciplinary topic. The research area is dominated by Australia, the USA, Canada, New Zealand, and European countries, which host the most productive authors and institutions. The hot topics of recent climate change and tourism research are further identified, including the consequences of climate change for tourism, necessary adaptations, the vulnerability of the tourism industry, tourist behaviour and demand in response to climate change, and emission reductions in the tourism sector. The work includes an in-depth analysis of a major forum on climate change and tourism to help readers better understand global trends in this field over the past 25 years.
Keywords: climate change, tourism, scientometrics, CiteSpace
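One of the core operations behind such a scientometric map, keyword co-occurrence counting, can be sketched in a few lines. The paper records below are invented for illustration; CiteSpace itself performs this at scale with time-slicing and burst detection.

```python
from collections import Counter
from itertools import combinations

# Hypothetical keyword lists from three publications.
papers = [
    {"climate change", "tourism", "adaptation"},
    {"climate change", "tourism", "vulnerability"},
    {"climate change", "tourism", "emissions"},
]

cooc = Counter()
for keywords in papers:
    # Each unordered keyword pair in a paper adds one co-occurrence.
    cooc.update(frozenset(pair) for pair in combinations(sorted(keywords), 2))

top_pair, count = cooc.most_common(1)[0]
```

The most frequent pair defines the strongest edge in the keyword co-occurrence network.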
Procedia PDF Downloads 411
28660 The Representations of Protesters in the UK National Daily Press: Pro- and Anti-Brexit Demonstrations 2016-2019
Authors: Charlotte-Rose Kennedy
Abstract:
In a political climate divided by Brexit, it is crucial to be critical of the press, as it is the apparatus which political authorities use to impose their laws and shape public opinion. Although large protests have the power to shake and disrupt policy-making by making it difficult for governments to ignore their goals, the British press has historically constructed protesters as illegitimate, deviant, and criminal, which can limit protests' credibility and democratic power. This paper explores how the remain-supporting daily UK press (The Mirror, Financial Times, The Independent, The Guardian) and the leave-supporting daily UK press (The Daily Mail, The Daily Star, The Sun, The Express, The Telegraph) discursively constructed every pro- and anti-Brexit demonstration from 2016 to 2019. 702 instances of the terms 'protester', 'protesters', 'protestor' and 'protestors' were analyzed through both transitivity analysis and critical discourse analysis. This mixed-methods approach allowed for the analysis of how the UK press perpetuated and upheld social ideologies about protests through specific grammatical and lexical choices. The analysis found that both the remain- and leave-supporting press utilized the same discourses to report on protests they opposed and protests they supported. For example, the remain-backing Mirror used water metaphors regularly associated with influxes of refugees and asylum seekers both to support the protesters on the remain protest 'Final Say' and to oppose the protesters on the leave protest 'March to Leave'. Discourses of war, violence, and victimhood are also taken up by both sides of the press Brexit debate and are again used to support and oppose the same arguments. Finally, the paper concludes that these analogous discourses do nothing to help the already marginalized social position of protesters in the UK and could potentially lead to reduced public support for demonstrations.
This could, in turn, facilitate the government in introducing increasingly restrictive legislation in relation to freedom of assembly rights, which could be detrimental to British democracy.
Keywords: Brexit, critical discourse analysis, protests, transitivity analysis, UK press
Procedia PDF Downloads 179
28659 Prediction of California Bearing Ratio of a Black Cotton Soil Stabilized with Waste Glass and Eggshell Powder using Artificial Neural Network
Authors: Biruhi Tesfaye, Avinash M. Potdar
Abstract:
The laboratory test process to determine the California bearing ratio (CBR) of black cotton soils is both expensive and time-consuming. Hence, advance prediction of CBR plays a significant role in pavement design. The prediction of the CBR of treated soil was executed with Artificial Neural Networks (ANNs), a computational tool based on the properties of the biological neural system. To observe CBR values, combined eggshell and waste glass powder was added to the soil at 4, 8, 12, and 16% of the weight of the soil samples. Accordingly, the related laboratory tests were conducted to obtain the best model. The maximum CBR value of 5.8 was found at 8% eggshell and waste glass powder addition. The model was developed using CBR as the output layer variable. CBR was considered a function of the joint effect of liquid limit, plastic limit, plasticity index, optimum moisture content, and maximum dry density. The best model found was an ANN with 5, 6, and 1 neurons in the input, hidden, and output layers, respectively. The performance of the selected ANN was 0.99996, 4.44E-05, 0.00353, and 0.0067 for the correlation coefficient (R), mean square error (MSE), mean absolute error (MAE), and root mean square error (RMSE), respectively. The research presented above throws light on the future scope of stabilization with waste glass combined with different percentages of eggshell, leading to the economical design of a CBR acceptable for pavement sub-base or base, as desired.
Keywords: CBR, artificial neural network, liquid limit, plastic limit, maximum dry density, OMC
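The reported 5-6-1 topology and the four performance measures can be sketched as follows. The weights are random placeholders (the trained values are not published), and the tanh hidden activation is an assumption; only the architecture and the metric definitions come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# 5-6-1 topology: five inputs (LL, PL, PI, OMC, MDD), six hidden
# neurons, one output (CBR). Weights are random placeholders.
W1, b1 = rng.normal(size=(5, 6)), np.zeros(6)
W2, b2 = rng.normal(size=(6, 1)), np.zeros(1)

def predict(X):
    hidden = np.tanh(X @ W1 + b1)       # tanh activation is an assumption
    return (hidden @ W2 + b2).ravel()

# The four performance measures reported in the abstract.
def metrics(y, y_hat):
    err = y - y_hat
    mse = float(np.mean(err ** 2))
    return {"R": float(np.corrcoef(y, y_hat)[0, 1]),
            "MSE": mse,
            "MAE": float(np.mean(np.abs(err))),
            "RMSE": float(np.sqrt(mse))}
```

In practice the weights would be fitted to the laboratory data set before `metrics` is evaluated on held-out samples.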
Procedia PDF Downloads 190
28658 Syntactic Ambiguity and Syntactic Analysis: Transformational Grammar Approach
Authors: Olufemi Olupe
Abstract:
Within linguistics, various approaches have been adopted to the study of language. One such approach is syntax, the aspect of grammar which deals with how words are put together to form phrases and sentences and how such structures are interpreted in language. Ambiguity, which is also germane in this discourse, is the uncertainty of meaning that arises when a phrase or sentence can be understood and interpreted in more than one way. In the light of the above, this paper attempts a syntactic study of syntactic ambiguities in the English language, using the Transformational Generative Grammar (TGG) approach. In doing this, phrases and sentences are raised, with each description followed by relevant analysis. The findings reveal that ambiguity cannot always be disambiguated by means of syntactic analysis alone, without recourse to semantic interpretation. A further finding shows that some syntactically ambiguous structures cannot be analysed into two surface structures, despite having more than one deep structure. The paper concludes that as long as ambiguity remains in language, it will continue to pose a problem of understanding to second language learners. Users of English as a second language must, however, make a conscious effort to avoid its usage in order to achieve effective communication.
Keywords: language, syntax, semantics, morphology, ambiguity
Procedia PDF Downloads 392
28657 High Throughput Virtual Screening against NS3 Helicase of Japanese Encephalitis Virus (JEV)
Authors: Soma Banerjee, Aamen Talukdar, Argha Mandal, Dipankar Chaudhuri
Abstract:
Japanese Encephalitis is a major infectious disease, with nearly half the world's population living in areas where it is prevalent. Currently, its management involves only supportive care and symptom relief, with prevention through vaccination. Due to the lack of antiviral drugs against Japanese Encephalitis Virus (JEV), the quest for such agents remains a priority. For these reasons, simulation studies of drug targets against JEV are important. Towards this purpose, docking experiments with kinase inhibitors were carried out against the chosen target, NS3 helicase, as it is a nucleoside-binding protein. Previous efforts in computational drug design against JEV revealed some lead molecules through virtual screening using public domain software. To find leads more specifically and accurately, this study uses the proprietary software Schrödinger GLIDE. The druggability of the pockets in the NS3 helicase crystal structure was first calculated with SITEMAP. The sites were then screened for compatibility with ATP, and the most ATP-compatible site was selected as the target. Virtual screening was performed with GLIDE on ligands acquired from three databases: KinaseSARfari, KinaseKnowledgebase, and a published inhibitor set. The 25 ligands with the best docking scores from each database were re-docked in XP mode. Protein structure alignment of NS3 was performed using VAST against MMDB, and similar human proteins were docked to all the best-scoring ligands. The ligands scoring low against these human proteins were chosen for further studies, while the high-scoring ligands were screened out. Seventy-three ligands were listed as the best scoring after HTVS. Protein structure alignment of NS3 revealed three human proteins with RMSD values of less than 2 Å. Docking results with these three proteins revealed the inhibitors that can interfere with and inhibit human proteins; those inhibitors were screened out.
Among the remaining ligands, those with docking scores worse than a threshold value were also removed to obtain the final hits. Analysis of the docked complexes through 2D interaction diagrams revealed the amino acid residues that are essential for ligand binding within the active site. Interaction analysis will help to find a strongly interacting scaffold among the hits. This experiment yielded 21 hits with the best docking scores, which could be investigated further for their drug-like properties. Aside from yielding suitable leads, specific NS3 helicase-inhibitor interactions were identified. The selection of target modification strategies complementing the docking methodologies, which can result in better lead compounds, is in progress. Those enhanced leads can then proceed to in vitro testing.
Keywords: antivirals, docking, GLIDE, high-throughput virtual screening, Japanese encephalitis, NS3 helicase
Procedia PDF Downloads 230
28656 The Environmental and Economic Analysis of Extended Input-Output Table for Thailand’s Biomass Pellet Industry
Authors: Prangvalai Buasan, Boonrod Sajjakulnukit, Thongchart Bowonthumrongchai
Abstract:
The demand for biomass pellets in the industrial sector has increased significantly since 2020. The revised version of Thailand's power development plan, as well as the Alternative Energy Development Plan, aims to promote biomass fuel consumption of around 485 MW by 2030. The replacement of solid fossil fuel with biomass pellets will affect the medium-term and long-term national benefits of all industries throughout the supply chain. Therefore, the environmental and economic impacts throughout the biomass pellet supply chain need to be evaluated to provide better insight into the goods and financial flows of this activity. This study extended the national input-output table to include the biomass pellet industry and applied the input-output analysis (IOA) method, a form of macroeconomic analysis, to interpret the transactions between industries in monetary units when the revised national power development plan is adopted and enforced. Greenhouse gas emissions from energy and raw material consumption throughout the supply chain are also evaluated. The total intermediate transactions of all economic sectors including the biomass pellet sector (CASE 2) increased by 0.02% compared with the conservative case (CASE 1). The control total, i.e., the sum of total intermediate transactions and value added, increased by 0.07% in CASE 2 compared with CASE 1. The pellet production process emitted 432.26 MtCO2e per year, with the major share of the GHG emissions coming from the plantation stage of the raw biomass.
Keywords: input-output analysis, environmental extended input-output analysis, macroeconomic planning, biomass pellets, renewable energy
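The computational core of any input-output analysis is the Leontief inverse. A toy three-sector sketch, with an invented pellet sector and made-up transaction flows (not the study's table), looks like this:

```python
import numpy as np

# Invented 3-sector transaction table Z (monetary units) and total outputs x;
# row/column order: agriculture, biomass pellets, power.
Z = np.array([[10.0, 5.0,  2.0],
              [ 0.0, 8.0,  6.0],
              [ 4.0, 3.0, 12.0]])
x = np.array([50.0, 30.0, 40.0])

A = Z / x                         # technical coefficients a_ij = z_ij / x_j
L = np.linalg.inv(np.eye(3) - A)  # Leontief inverse (I - A)^(-1)

# Total sectoral output needed to satisfy a given final demand vector.
final_demand = np.array([20.0, 10.0, 15.0])
output_needed = L @ final_demand
```

Extending the national table for a new pellet sector amounts to adding a row and column to `Z` and re-deriving `A` and `L` before comparing the two cases.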
Procedia PDF Downloads 98
28655 The Effect of Artificial Intelligence on Civil Engineering Outputs and Designs
Authors: Mina Youssef Makram Ibrahim
Abstract:
Engineering identity contributes to the professional and academic sustainability of female engineers, and recognition is an important factor that shapes an engineer's identity: people who are deprived of real recognition often fail to create a positive identity. This study draws on Honneth's recognition theory to identify factors that influence female civil engineers' sense of recognition. A survey was created and distributed to 330 graduate students in the Department of Civil, Construction and Environmental Engineering at Iowa State University. Survey items covered demographics, perceptions of a civil engineer's identity, and factors that influence recognition of a civil engineer's identity, such as the views of society and family. Descriptive analysis of the survey responses revealed that perceptions of civil engineering varied significantly. The definitions of civil engineering provided by participants included the terms structure, design, and infrastructure. Almost half of the participants said the main reason for studying civil engineering was their interest in the subject, and the majority said they were proud to be civil engineers. Many study participants reported that their parents viewed them as civil engineers. Institutional and operational treatment was also found to have a significant impact on the recognition of women civil engineers: almost half of the participants reported feeling isolated or ignored at work because of their gender. This research highlights the importance of recognition in developing the identity of women engineers.
Keywords: civil engineering, gender, identity, recognition
Procedia PDF Downloads 61
28654 Designing Next Generation Platforms for Recombinant Protein Production by Genome Engineering of Escherichia coli
Authors: Priyanka Jain, Ashish K. Sharma, Esha Shukla, K. J. Mukherjee
Abstract:
We propose a paradigm shift in our approach to designing improved platforms for recombinant protein production, addressing system-level issues rather than the individual steps associated with recombinant protein synthesis, such as transcription and translation. We demonstrate that by controlling and modulating the cellular stress response (CSR), which is responsible for feedback control of protein synthesis, we can generate hyper-producing strains. We performed transcriptomic profiling of post-induction cultures expressing different types of protein to analyze the nature of this cellular stress response. We found significant down-regulation of substrate utilization, translation, and energy metabolism genes due to the generation of the CSR inside the host cell. However, transcription profiling also showed that many genes are up-regulated post induction, and their role in modulating the CSR is unclear. We hypothesized that these up-regulated genes trigger signaling pathways that generate the CSR and concomitantly reduce the recombinant protein yield. To test this hypothesis, we knocked out the up-regulated genes which did not have any downstream regulatees and analyzed their impact on cellular health and recombinant protein expression. Two model proteins, GFP and L-asparaginase, were chosen for this analysis. We observed a significant improvement in expression levels, with some knock-outs showing more than 7-fold higher expression compared to the control. The 10 best single knock-outs were chosen to make 45 combinations of all possible double knock-outs. A further increase in expression was observed in some of these double knock-outs, with GFP levels being highest in the double knock-out ΔyhbC + ΔelaA. However, for L-asparaginase, which is a secretory protein, the best results were obtained using a combination of the ΔelaA + ΔcysW knock-outs. We then tested all the knock-outs for their ability to enhance the expression of a 'difficult-to-express' protein.
The Rubella virus E1 protein was chosen and tagged with sfGFP at the C-terminus using a linker peptide for easy online monitoring of the expression of this fusion protein. Interestingly, the highest increase in Rubella-sfGFP levels was obtained in the same double knock-out, ΔelaA + ΔcysW (a 5.6-fold increase in expression yield compared to the control), which gave the highest expression for L-asparaginase. However, for sfGFP alone, the ΔyhbC + ΔmarR knock-out gave the highest level of expression. These results indicate that there is a fair degree of commonality in the nature of the CSR generated by the induction of different proteins. Transcriptomic profiling of the double knock-out showed that many genes associated with the translational machinery and energy biosynthesis did not get down-regulated post induction, unlike in the control, where these genes were significantly down-regulated. This confirmed our hypothesis that these genes play an important role in the generation of the CSR and allowed us to design a strategy for making better expression hosts by simply knocking out key genes. This strategy is radically superior to the previous approach of individually up-regulating critical genes, since it blocks the mounting of the CSR, thus preventing the down-regulation of the very large number of genes responsible for sustaining the flux through the recombinant protein production pathway.
Keywords: cellular stress response, GFP, knock-outs, up-regulated genes
Procedia PDF Downloads 225
28653 Toward Destigmatizing the Autism Label: Conceptualizing Celebratory Technologies
Authors: LouAnne Boyd
Abstract:
From the perspective of self-advocates, the biggest unaddressed problem is not the symptoms of an autism spectrum diagnosis but the social stigma that accompanies autism. This societal perspective stands in contrast to the focus of the majority of interventions. Autism interventions, and consequently most innovative technologies for autism, aim to improve deficits that occur within the person. For example, the most common Human-Computer Interaction research projects in assistive technology for autism target social skills from a normative perspective. The premise of these autism technologies is that difficulties occur inside the body; hence, the medical model focuses on ways to improve the ailment within the person. However, other technological approaches to supporting people with autism do exist. Within Human-Computer Interaction, there are other modes of research that critique the medical model. For example, critical design, whose intended audience is industry or other HCI researchers, provides products that are the opposite of interventionist work, to draw attention to the misalignment between the lived experience and the societal perception of autism. Parodies of interventionist work exist to provoke change, such as a recent project called Facesavr, a face covering that helps allistic adults be more independent in their emotional processing. Additionally, from a critical disability studies perspective, assistive technologies perpetuate harmful normalizing behaviors. However, these critical approaches can feel far from the front line in terms of taking direct action to positively impact end users.
From a critical yet more pragmatic perspective, projects such as Counterventions list ways to reduce the likelihood of perpetuating ableism in interventionist work by reflectively analyzing a series of evolving assistive technology projects through a societal lens, thus leveraging the momentum of the evolving ecology of technologies for autism. Nevertheless, all current paradigms fall short of addressing the largest need: the negative impact of social stigma. The current work introduces a new paradigm for technologies for autism, borrowing from a paradigm introduced two decades ago around changing the narrative related to eating disorders: the shift from reprimanding poor habits to celebrating positive aspects of eating. This work repurposes Celebratory Technology for neurodiversity and is intended to reduce social stigma by targeting the public at large. This presentation will review how requirements were derived from current research on autism social stigma as well as from design sessions with autistic adults. Congruence between these two sources revealed three key design implications for technology: provide awareness of the autistic experience; generate acceptance of neurodivergence; and cultivate an appreciation for the talents and accomplishments of neurodivergent people. The current pilot work in Celebratory Technology offers a new paradigm for supporting autism by shifting the burden of change from the person with autism to changing society's biases at large. Shifting the focus of research outside the autistic body creates a new space for design that extends beyond the bodies of a few and calls on all to embrace humanity as a whole.
Keywords: neurodiversity, social stigma, accessibility, inclusion, celebratory technology
Procedia PDF Downloads 72
28652 Round Addition DFA on Lightweight Block Ciphers with On-The-Fly Key Schedule
Authors: Hideki Yoshikawa, Masahiro Kaminaga, Arimitsu Shikoda, Toshinori Suzuki
Abstract:
Round addition differential fault analysis (DFA) using operation bypassing for lightweight block ciphers with on-the-fly key schedule is presented. For 64-bit KLEIN and 64-bit LED, it is shown that only a pair of correct ciphertext and faulty ciphertext can derive the secret master key. For PRESENT, one correct ciphertext and two faulty ciphertexts are required to reconstruct the secret key.
Keywords: differential fault analysis (DFA), round addition, block cipher, on-the-fly key schedule
Procedia PDF Downloads 702
28651 Multiaxial Fatigue Analysis of a High Performance Nickel-Based Superalloy
Authors: P. Selva, B. Lorraina, J. Alexis, A. Seror, A. Longuet, C. Mary, F. Denard
Abstract:
Over the past four decades, the fatigue behavior of nickel-based alloys has been widely studied. In recent years, however, significant advances in the fabrication process, leading to grain size reduction, have been made in order to improve the fatigue properties of aircraft turbine discs. Indeed, a change in particle size affects the initiation mode of fatigue cracks as well as the fatigue life of the material. The present study aims to investigate the fatigue behavior of a newly developed nickel-based superalloy under biaxial-planar loading. Low Cycle Fatigue (LCF) tests are performed at different stress ratios so as to study the influence of the multiaxial stress state on the fatigue life of the material. Full-field displacement and strain measurements, as well as crack initiation detection, are obtained using Digital Image Correlation (DIC) techniques. The aim of this presentation is first to provide an in-depth description of both the experimental set-up and protocol: the multiaxial testing machine, the specific design of the cruciform specimen, and the performance of the DIC code are introduced. Second, results for sixteen specimens tested at different load ratios are presented: crack detection, strain amplitude, and number of cycles to crack initiation vs. triaxial stress ratio are given for each loading case. Third, fractographic investigations by scanning electron microscopy show that the mechanism of fatigue crack initiation does not depend on the triaxial stress ratio and that most fatigue cracks initiate from subsurface carbides.
Keywords: cruciform specimen, multiaxial fatigue, nickel-based superalloy
Procedia PDF Downloads 293
28650 Investigation of Pollution and the Physical and Chemical Condition of Polour River, East of Tehran, Iran
Authors: Azita Behbahaninia
Abstract:
This research was carried out to determine the water quality and physico-chemical properties of the Polour River, one of the main branches of the Haraz River. The Polour River was studied for a period of one year, with samples taken from different stations along its main branch. In the water samples, pH, DO, SO4, Cl, PO4, NO3, EC, BOD, COD, temperature, colour, and the number of coliforms per litre were determined. ArcGIS was used for the zoning of phosphate concentration in the Polour River basin. The results indicated that the river is polluted at the Polour village station because of domestic wastewater discharge, at the Ziar village station because of agricultural wastewater, and at the aquaculture station because of fish pond wastewater. Statistical analysis shows that the regression relationship between the independent traits and coliform is significant at the 1% level. The coefficient of determination indicated that the independent traits explain 80% of the coliform variation, with the remaining 20% attributable to unknown parameters. The causality analysis showed that temperature (0.6) has the strongest positive direct effect on coliform, while sulfate has a direct negative effect; these results match the regression analysis, and the other direct and indirect effects were negligible. The Kruskal-Wallis test showed differences between sampling stations for the studied characters: significant at the 1% level for temperature, DO, COD, EC, sulfate, and coliform, and at the 5% level for phosphate.
Keywords: coliform, GIS, pollution, phosphate, river
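The Kruskal-Wallis statistic used for the between-station comparison is straightforward to compute by hand when there are no tied values. The station counts below are invented for illustration, not the study's data:

```python
# Kruskal-Wallis H statistic (valid only when there are no tied values).
def kruskal_h(*groups):
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}   # rank of each value
    n = len(pooled)
    mean_rank = (n + 1) / 2
    return 12.0 / (n * (n + 1)) * sum(
        len(g) * (sum(rank[v] for v in g) / len(g) - mean_rank) ** 2
        for g in groups)

# Hypothetical coliform counts per litre at three sampling stations.
village     = [1200, 1350, 1100, 1400, 1250]
aquaculture = [ 800,  900,  850,  950,  870]
upstream    = [ 100,  120,   90,  110,  105]

h = kruskal_h(village, aquaculture, upstream)
```

`h` is compared against chi-squared critical values with k-1 = 2 degrees of freedom (9.21 at the 1% level), so fully separated groups like these differ significantly.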
Procedia PDF Downloads 466
28649 AniMoveMineR: Animal Behavior Exploratory Analysis Using Association Rules Mining
Authors: Suelane Garcia Fontes, Silvio Luiz Stanzani, Pedro L. Pizzigatti Corrêa, Ronaldo G. Morato
Abstract:
Environmental changes and major natural disasters have become more prevalent in the world due to the damage that humanity has caused to nature, and this damage directly affects the lives of animals. Thus, the study of animal behavior and of animal interactions with the environment can provide knowledge that guides researchers and public agencies in preservation and conservation actions. Exploratory analysis of animal movement can reveal patterns of animal behavior, and technological advances in animal tracking have consequently expanded behavioral studies. There is a great deal of research on animal movement and behavior, but we note the absence of a proposal that combines these resources to allow exploratory analysis of animal movement while providing statistical measures of individual animal behavior and its interaction with the environment. The contribution of this paper is to present the framework AniMoveMineR, a unified solution that aggregates trajectory analysis and data mining techniques to explore animal movement data, as a first step towards answering questions about individual animal behavior and interactions with other animals over time and space. We evaluated the framework using data from monitored jaguars in Miranda, in the Brazilian Pantanal, in order to verify whether AniMoveMineR allows the interaction level between these jaguars to be identified. The results were positive, providing indications about the individual behavior of the jaguars and about which jaguars have the highest or lowest correlation.
Keywords: data mining, data science, trajectory, animal behavior
Procedia PDF Downloads 143
28648 Application of Response Surface Methodology to Optimize the Factor Influencing the Wax Deposition of Malaysian Crude Oil
Authors: Basem Elarbe, Ibrahim Elganidi, Norida Ridzuan, Norhyati Abdullah
Abstract:
Wax deposition in production pipelines and in transportation tubing from offshore to onshore is a critical problem in the oil and gas industry due to low-temperature conditions. It may lead to reduced production, shut-ins, plugging of pipelines, and increased fluid viscosity. The most popular approach to this issue is the injection of a wax inhibitor into the pipeline. This research aims to determine the amount of wax deposition of Malaysian crude oil and to estimate the effective parameters using Design-Expert (version 7.1.6) and the response surface methodology (RSM). Important parameters affecting wax deposition, namely cold finger temperature, inhibitor concentration, and experimental duration, were investigated. It can be concluded that the SA-co-BA copolymer had a high capability of reducing wax under different conditions, with the minimum wax deposit found at 300 rpm, 14 °C, 1 h, and 1200 ppm; the amount of wax collected under these conditions was 0.12 g. The RSM approach was applied using a rotatable central composite design (CCD) to minimize the wax deposit amount. The analysis of variance (ANOVA) of the regression model revealed an R2 value of 0.9906, indicating that the model explains 99.06% of the data variation, with only 0.94% of the total variation left unexplained. This indicates that the model is highly significant, confirming close agreement between the experimental and predicted values. In addition, the results showed that the amount of wax deposit decreased significantly with increasing temperature and concentration of poly(stearyl acrylate-co-behenyl acrylate) (SABA), which were set at 14 °C and 1200 ppm, respectively. The amount of wax deposit was successfully reduced to a minimum value of 0.01 g after optimization.
Keywords: wax deposition, SABA inhibitor, RSM, operation factors
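An RSM fit of this kind amounts to least-squares regression of a full second-order polynomial over the CCD points, followed by the R2 check the abstract reports. The design points and response below are synthetic stand-ins that merely mimic the reported trend (deposit falling as temperature and SABA dose rise), not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented design points: cold-finger temperature (C) and SABA dose (ppm).
T = rng.uniform(5.0, 20.0, 30)
C = rng.uniform(200.0, 1200.0, 30)
# Synthetic wax deposit (g) with small noise, decreasing in both factors.
wax = 0.5 - 0.01 * T - 2e-4 * C + 5e-7 * T * C + rng.normal(0.0, 0.001, 30)

# Full second-order RSM model:
# y = b0 + b1*T + b2*C + b3*T^2 + b4*C^2 + b5*T*C
X = np.column_stack([np.ones_like(T), T, C, T ** 2, C ** 2, T * C])
beta, *_ = np.linalg.lstsq(X, wax, rcond=None)

pred = X @ beta
r2 = 1.0 - np.sum((wax - pred) ** 2) / np.sum((wax - wax.mean()) ** 2)
```

Design-Expert performs the same fit and adds the ANOVA table; optimizing the fitted surface then points to the low-deposit corner of the design space.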
Procedia PDF Downloads 282