Search results for: Squared Error (SE) loss function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9570

6780 Enhancing Heavy Oil Recovery: Experimental Insights into Low Salinity Polymer in Sandstone Reservoirs

Authors: Intisar, Khalifa, Salim, Al Busaidi

Abstract:

Recently, the synergistic combination of low salinity water flooding with polymer flooding has been a subject of paramount interest for the oil industry. Numerous studies have investigated the efficiency of enhanced oil recovery using low salinity polymer flooding (LSPF). However, there is no clear conclusion that explains the incremental oil recovery, determines the main factors controlling the oil recovery process, and defines the relative contribution of rock/fluid or fluid/fluid interactions to extra oil recovery. Therefore, this study aims to perform a systematic investigation of the interactions between oil, polymer, low salinity brine, and the sandstone rock surface, from pore to core scale, during LSPF. Partially hydrolyzed polyacrylamide (HPAM) polymer, Boise outcrop rock, a crude oil sample and reservoir cores from an Omani oil field, and brine at two different salinities were used in the study. Several experimental techniques were employed, including static bulk measurements of polymer solutions prepared with brines of high and low salinities and single-phase displacement experiments, together with rheological, total organic carbon, and ion chromatography measurements to analyze ion exchange reactions, polymer adsorption, and viscosity loss. In addition, two-phase experiments were performed to demonstrate the oil recovery efficiency of LSPF. The results revealed that the incremental oil recovery from LSPF was attributable to the combination of a reduction in the water-oil mobility ratio, an increase in the repulsion forces at the crude oil/brine/rock interfaces, and an increase in the pH of the aqueous solution. In addition, lowering the salinity of the make-up brine resulted in a larger conformation (expansion) of the polymer molecules, which in turn resulted in less adsorption and a greater in-situ viscosity without any negative impact on injectivity. This plays a positive role in the oil displacement process. Moreover, the loss of viscosity in the effluent of polymer solutions was lower in low-salinity than in high-salinity brine, indicating that an increase in cation concentration (mainly driven by Ca2+ ions) has a stronger effect on the viscosity of the high-salinity polymer solution than on the low-salinity one.

Keywords: polymer, heavy oil, low salinity, COBR interactions

Procedia PDF Downloads 62
6779 Endothelial Progenitor Cells Is a Determinant of Vascular Function and Atherosclerosis in Ankylosing Spondylitis

Authors: Ashit Syngle, Inderjit Verma, Pawan Krishan

Abstract:

Objective: Endothelial progenitor cells (EPCs) have reparative potential in overcoming endothelial dysfunction and reducing cardiovascular risk. EPC depletion has been demonstrated in the setting of established atherosclerotic disease. Against this background, we evaluated whether a reduced EPC population is associated with endothelial dysfunction, subclinical atherosclerosis and inflammatory markers in ankylosing spondylitis (AS) patients without any known traditional cardiovascular risk factor. Methods: Levels of circulating EPCs (CD34+/CD133+), brachial artery flow-mediated dilatation, carotid intima-media thickness (CIMT) and inflammatory markers, i.e., erythrocyte sedimentation rate (ESR), C-reactive protein (CRP), tumor necrosis factor (TNF)-α, interleukin (IL)-6 and IL-1, were assessed in 30 AS patients (mean age 33.41 ± 10.25; 11 female and 19 male) who fulfilled the modified New York diagnostic criteria, and in 25 healthy volunteers (mean age 29.36 ± 8.64; 9 female and 16 male) matched for age and sex. Results: EPCs (CD34+/CD133+) were significantly reduced in patients with AS compared to healthy controls (0.020 ± 0.001% versus 0.040 ± 0.010%, p<0.001). Endothelial function (7.35 ± 2.54 versus 10.27 ± 1.73, p=0.002), CIMT (0.63 ± 0.01 versus 0.35 ± 0.02, p<0.001) and inflammatory markers were also significantly (p<0.01) altered compared to healthy controls. Specifically, in multivariate analysis CD34+/CD133+ cells were inversely correlated with CRP and TNF-α, and endothelial dysfunction was associated with a reduced number of EPCs. Conclusion: Depletion of the EPC population is an independent predictor of endothelial dysfunction and early atherosclerosis in AS patients and may provide additional information beyond conventional risk factors and inflammatory markers.

Keywords: endothelial progenitor cells, atherosclerosis, ankylosing spondylitis, cardiovascular

Procedia PDF Downloads 365
6778 A Methodology Based on Image Processing and Deep Learning for Automatic Characterization of Graphene Oxide

Authors: Rafael do Amaral Teodoro, Leandro Augusto da Silva

Abstract:

Originated from graphite, graphene is a two-dimensional (2D) material that promises to revolutionize technology in many different areas, such as energy, telecommunications, civil construction, aviation, textiles, and medicine. This is possible because its structure, formed by carbon bonds, provides desirable optical, thermal, and mechanical characteristics that are interesting to multiple areas of the market. Thus, several research and development centers are studying different manufacturing methods and material applications of graphene, which are often compromised by the scarcity of agile and accurate methodologies to characterize the material – that is, to determine its composition, shape, size, and the number of layers and crystals. To address this need, this study proposes a computational methodology that applies deep learning to identify graphene oxide crystals in order to characterize samples by crystal size. To achieve this, a fully convolutional neural network called U-net was trained to segment SEM images of graphene oxide. The segmentation generated by the U-net is refined with a per-class standard deviation technique, which allows crystals to be distinguished with different labels through an object delimitation algorithm. Next, the position, area, perimeter, and lateral measures of each detected crystal are extracted from the images. This information generates a database with the dimensions of the crystals that compose the samples. Finally, graphs are automatically created showing the frequency distributions of crystal area and perimeter. This methodological process resulted in a high capacity for segmenting graphene oxide crystals, with accuracy and F-score of 95% and 94%, respectively, over the test set. Such performance demonstrates a high generalization capacity of the method in crystal segmentation, since it holds under significant variation in image acquisition quality. The measurement of non-overlapping crystals presented an average error of 6% across the different measurement metrics, suggesting that the model provides high-accuracy measurements for non-overlapping segmentations. For overlapping crystals, however, a limitation of the model was identified. To overcome this limitation, it is important that the samples to be analyzed are properly prepared; this minimizes crystal overlap during SEM image acquisition and guarantees a lower measurement error without extra effort for data handling. All in all, the method is a substantial time saver with high measurement value, capable of measuring hundreds of graphene oxide crystals in seconds and thereby saving weeks of manual work.
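As a rough illustration of the measurement step described above (not the authors' implementation), the sketch below labels connected crystal regions in a binary segmentation mask and reports per-crystal area and perimeter in pixels; the toy mask, function names, and 4-connectivity choice are all illustrative assumptions.

```python
from collections import deque

def label_components(mask):
    """Label 4-connected components of a binary mask (list of lists of 0/1)."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    n = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                n += 1
                labels[y][x] = n
                queue = deque([(y, x)])
                while queue:  # breadth-first flood fill
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = n
                            queue.append((ny, nx))
    return labels, n

def measure(labels, n):
    """Per-component area (pixel count) and perimeter (count of exposed pixel edges)."""
    h, w = len(labels), len(labels[0])
    area, perim = [0] * (n + 1), [0] * (n + 1)
    for y in range(h):
        for x in range(w):
            lab = labels[y][x]
            if lab:
                area[lab] += 1
                for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                    if not (0 <= ny < h and 0 <= nx < w) or labels[ny][nx] != lab:
                        perim[lab] += 1
    return area[1:], perim[1:]

# Toy 3x4 mask with two "crystals": a 2x2 block and a 1x2 bar.
mask = [[1, 1, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 1]]
labels, n = label_components(mask)
print(measure(labels, n))  # areas and perimeters per crystal
```

A production pipeline would run this on the U-net output and convert pixel measures to physical units via the SEM scale bar.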

Keywords: characterization, graphene oxide, nanomaterials, U-net, deep learning

Procedia PDF Downloads 137
6777 Investigating the Process Kinetics and Nitrogen Gas Production in Anammox Hybrid Reactor with Special Emphasis on the Role of Filter Media

Authors: Swati Tomar, Sunil Kumar Gupta

Abstract:

Anammox is a novel and promising technology that has changed the traditional concept of biological nitrogen removal. The process facilitates direct oxidation of ammonical nitrogen under anaerobic conditions with nitrite as the electron acceptor, without the addition of external carbon sources. The present study investigated the feasibility of an anammox hybrid reactor (AHR) combining the dual advantages of suspended and attached growth media for biodegradation of ammonical nitrogen in wastewater. The experimental unit consisted of four AHRs of 5 L capacity, inoculated with a mixed seed culture containing anoxic and activated sludge (1:1). The process was established by feeding the reactors with synthetic wastewater containing NH4-N and NO2-N in the ratio 1:1 at a hydraulic retention time (HRT) of 1 day. The reactors were gradually acclimated to higher ammonium concentrations until they attained pseudo-steady-state removal at a total nitrogen concentration of 1200 mg/l. During this period, the performance of the AHR was monitored at twelve different HRTs varying from 0.25 to 3.0 d, with the nitrogen loading rate (NLR) increasing from 0.4 to 4.8 kg N/m3·d. The AHR demonstrated significantly higher nitrogen removal (95.1%) at the optimal HRT of 1 day. Filter media in the AHR contributed an additional 27.2% ammonium removal along with a 72% reduction in the sludge washout rate. This may be attributed to the functional mechanism of the filter media, which acts as a mechanical sieve and reduces the sludge washout rate manyfold. This enhanced the biomass retention capacity of the reactor by 25%, which is the key parameter for successful operation of high-rate bioreactors. The effluent nitrate concentration, one of the bottlenecks of the anammox process, was also minimised significantly (42.3-52.3 mg/L). Process kinetics was evaluated using first-order and Grau second-order models. The first-order substrate removal rate constant was found to be 13.0 d-1. Model validation revealed that the Grau second-order model was more precise and predicted the effluent nitrogen concentration with the least error (1.84±10%). A new mathematical model based on mass balance was developed to predict N2 gas production in the AHR. The mass balance model derived from total nitrogen showed a significantly higher correlation (R2=0.986) and predicted N2 gas with the least error of precision (0.12±8.49%). SEM study of the biomass indicated the presence of a heterogeneous population of cocci and rod-shaped bacteria with average diameters varying from 1.2 to 1.5 μm. Owing to its enhanced nitrogen removal efficiency, meagre production of effluent nitrate, and ability to retain high biomass, the AHR proved to be a highly competitive reactor configuration for dealing with nitrogen-laden wastewater.
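For illustration only: assuming the common completely-mixed, steady-state form of the first-order model, (S0 − Se)/HRT = k1·Se, the rate constant reported above (13.0 d-1) implies the effluent concentrations sketched below. This model form is a textbook assumption, not necessarily the exact formulation used in the study.

```python
def effluent_first_order(s0, k1, hrt):
    """Steady-state effluent concentration assuming first-order removal in a
    completely mixed reactor: (S0 - Se)/HRT = k1 * Se  =>  Se = S0 / (1 + k1*HRT)."""
    return s0 / (1.0 + k1 * hrt)

K1 = 13.0    # d^-1, first-order rate constant reported in the abstract
S0 = 1200.0  # mg/L influent total nitrogen, from the abstract

for hrt in (0.25, 0.5, 1.0, 3.0):  # d
    se = effluent_first_order(S0, K1, hrt)
    print(f"HRT = {hrt:4.2f} d  Se = {se:6.1f} mg/L  removal = {100*(1 - se/S0):5.1f}%")
```

Under these assumptions the model predicts roughly 93% removal at the optimal HRT of 1 d, in the same range as the 95.1% reported.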

Keywords: anammox, filter media, kinetics, nitrogen removal

Procedia PDF Downloads 359
6776 The Effect of Written Corrective Feedback on the Accurate Use of Grammatical Forms by Japanese Low-Intermediate EFL Learners

Authors: Ayako Hasegawa, Ken Ubukata

Abstract:

The purpose of this study is to investigate whether corrective feedback has any significant effect on Japanese low-intermediate EFL learners' performance on a specific set of linguistic features. The subjects are Japanese college students majoring in English. They have studied English for about 7 years, but their interlanguage seems to have fossilized, because non-target-like errors are frequently observed under the traditional deductive, teacher-fronted approach. It has been reported that corrective feedback plays an important role in diminishing or overcoming interlanguage fossilization and achieving target language (TL) competency. Therefore, we examined how corrective feedback (the focus of this study was metalinguistic feedback) and self-correction raised the students' awareness and helped them notice the gaps between their interlanguage and the TL.

Keywords: written corrective feedback, fossilized error, grammar teaching, language teaching

Procedia PDF Downloads 338
6775 Evaluation of Short-Term Load Forecasting Techniques Applied for Smart Micro-Grids

Authors: Xiaolei Hu, Enrico Ferrera, Riccardo Tomasi, Claudio Pastrone

Abstract:

Load forecasting plays a key role in making today's and tomorrow's smart energy grids sustainable and reliable. Accurate power consumption prediction allows utilities to organize their resources in advance or to execute demand response strategies more effectively, which enables several benefits such as higher sustainability, better quality of service, and affordable electricity tariffs. While load forecasting is comparatively easy yet effective at larger geographic scales, in smart micro-grids the lower available grid flexibility makes accurate prediction more critical for demand response applications. This paper analyses the application of short-term load forecasting in a concrete scenario, proposed within the EU-funded GreenCom project, which collects load data from single loads and households belonging to a smart micro-grid. Three short-term load forecasting techniques, i.e. linear regression, artificial neural networks, and radial basis function networks, are considered, compared, and evaluated through absolute forecast errors and training time. The influence of weather conditions on load forecasting is also evaluated. A new definition of gain is introduced in this paper, which innovatively serves as an indicator of short-term prediction capability over consistent time spans. Two models, for 24-hour-ahead and 1-hour-ahead forecasting, are built to comprehensively compare these three techniques.
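As a minimal sketch of the 1-hour-ahead idea (synthetic data, not the GreenCom dataset, and plain lag-1 linear regression rather than the paper's full feature set):

```python
import math

def fit_simple_lr(x, y):
    """Ordinary least squares fit of y = a + b*x (closed form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Synthetic hourly load (kW) with a daily cycle -- placeholder data only.
load = [100 + 30 * math.sin(2 * math.pi * h / 24) for h in range(24 * 14)]

# 1-hour-ahead model: predict load[t] from load[t-1]; train on the first 13 days.
split = 24 * 13
a, b = fit_simple_lr(load[:split - 1], load[1:split])

# Evaluate with the mean absolute forecast error on the held-out last day.
errors = [abs(a + b * load[t - 1] - load[t]) for t in range(split, len(load))]
mae = sum(errors) / len(errors)
print(f"1-hour-ahead MAE: {mae:.2f} kW")
```

A 24-hour-ahead model would instead regress on load[t-24]; the paper's ANN and RBF variants replace the linear fit while keeping the same train/evaluate split.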

Keywords: short-term load forecasting, smart micro grid, linear regression, artificial neural networks, radial basis function network, gain

Procedia PDF Downloads 441
6774 Stability Bound of Ruin Probability in a Reduced Two-Dimensional Risk Model

Authors: Zina Benouaret, Djamil Aissani

Abstract:

In this work, we introduce the qualitative and quantitative concept of the strong stability method in a risk process modeling two lines of business of the same insurance company, or an insurance company and a reinsurance company that divide both claims and premiums between them in a certain proportion. The proposed approach is based on identifying the ruin probability associated with the considered model with the stationary distribution of a Markov random process called the reversed process. Our objective, after clarifying the condition and the perturbation domain of the parameters, is to obtain a stability inequality for the ruin probability, which is applied to estimate the approximation error incurred when a model with perturbed parameters is replaced by the considered model. In the stability bound obtained, all constants are written explicitly.

Keywords: Markov chain, risk models, ruin probabilities, strong stability analysis

Procedia PDF Downloads 230
6773 Robust Inference with a Skew T Distribution

Authors: M. Qamarul Islam, Ergun Dogan, Mehmet Yazici

Abstract:

There is a growing body of evidence that non-normal data is more prevalent in nature than the normal one. Examples can be quoted from, but not restricted to, the areas of Economics, Finance and Actuarial Science. The non-normality considered here is expressed in terms of fat-tailedness and asymmetry of the relevant distribution. In this study a skew t distribution that can be used to model a data that exhibit inherent non-normal behavior is considered. This distribution has tails fatter than a normal distribution and it also exhibits skewness. Although maximum likelihood estimates can be obtained by solving iteratively the likelihood equations that are non-linear in form, this can be problematic in terms of convergence and in many other respects as well. Therefore, it is preferred to use the method of modified maximum likelihood in which the likelihood estimates are derived by expressing the intractable non-linear likelihood equations in terms of standardized ordered variates and replacing the intractable terms by their linear approximations obtained from the first two terms of a Taylor series expansion about the quantiles of the distribution. These estimates, called modified maximum likelihood estimates, are obtained in closed form. Hence, they are easy to compute and to manipulate analytically. In fact the modified maximum likelihood estimates are equivalent to maximum likelihood estimates, asymptotically. Even in small samples the modified maximum likelihood estimates are found to be approximately the same as maximum likelihood estimates that are obtained iteratively. It is shown in this study that the modified maximum likelihood estimates are not only unbiased but substantially more efficient than the commonly used moment estimates or the least square estimates that are known to be biased and inefficient in such cases. 
Furthermore, in conventional regression analysis it is assumed that the error terms are normally distributed, and hence the well-known least squares method is considered a suitable and preferred method for making the relevant statistical inferences. However, a number of empirical studies have shown that non-normal errors are more prevalent. Even transforming and/or filtering techniques may not produce normally distributed residuals. Here, multiple linear regression models with random errors following a non-normal pattern are studied. Through an extensive simulation, it is shown that the modified maximum likelihood estimates of the regression parameters are plausibly robust to the distributional assumptions and to various data anomalies, as compared to the widely used least squares estimates. Relevant tests of hypothesis are developed and explored for desirable properties in terms of their size and power. The tests based upon modified maximum likelihood estimates are found to be substantially more powerful than the tests based upon least squares estimates. Several examples are provided from the areas of Economics and Finance where such distributions are interpretable in terms of the efficient market hypothesis with respect to asset pricing, portfolio selection, risk measurement, capital allocation, etc.
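The modified maximum likelihood estimator itself is involved; purely to illustrate the robustness theme, the sketch below contrasts the least squares slope with a Theil-Sen slope (a different, classical robust estimator, used here only as a stand-in, not the authors' method) on synthetic data with one fat-tailed error:

```python
import random
from statistics import median

def ols_slope(x, y):
    """Least squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)

def theil_sen_slope(x, y):
    """Median of all pairwise slopes -- a classic robust alternative to least squares."""
    return median((y[j] - y[i]) / (x[j] - x[i])
                  for i in range(len(x)) for j in range(i + 1, len(x)) if x[j] != x[i])

random.seed(0)
x = list(range(30))
y = [2.0 * xi + random.gauss(0, 1) for xi in x]  # true slope = 2
y[5] += 100.0                                    # one gross, fat-tailed error

print(f"least squares slope: {ols_slope(x, y):.3f}")    # dragged away from 2
print(f"Theil-Sen slope:     {theil_sen_slope(x, y):.3f}")  # stays near 2
```

The single outlier shifts the least squares slope noticeably while the robust estimate barely moves, which is the qualitative behavior the abstract reports for its MML-based tests.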

Keywords: least square estimates, linear regression, maximum likelihood estimates, modified maximum likelihood method, non-normality, robustness

Procedia PDF Downloads 383
6772 Khaya Cellulose Supported Copper Nanoparticles for Chemo Selective Aza-Michael Reactions

Authors: M. Shaheen Sarkar, M. Lutfor Rahman, Mashitah Mohd Yusoff

Abstract:

We prepared highly active Khaya cellulose supported poly(hydroxamic acid) copper nanoparticles by surface modification of Khaya cellulose through graft co-polymerization and subsequent amidoximation. The Cu nanoparticles (0.05 mol% to 50 mol ppm) selectively promoted the aza-Michael reaction of aliphatic amines to give the corresponding alkylated products at room temperature in methanol. The supported nanoparticles were easy to recover and were reused seven times without significant loss of activity.

Keywords: Aza-Michael, copper, cellulose, nanoparticles, poly(hydroxamic acid)

Procedia PDF Downloads 313
6771 Nontuberculous Mycobacterium Infection – Still An Important Disease Among People With Late HIV Diagnosis

Authors: Jakub Młoźniak, Adam Szymański, Gabriela Stondzik, Dagny Krankowska, Tomasz Mikuła

Abstract:

Nontuberculous mycobacteria (NTM) are bacterial species that cause diversely manifesting diseases, mainly in immunocompromised patients. In people with HIV, NTM infection is an AIDS-defining disease and usually appears when the lymphocyte T CD4 count is below 50 cells/μl. The use of antiretroviral therapy has decreased the prevalence of NTM among people with HIV, but the disease can still be observed, especially among patients with late HIV diagnosis. Their common presence in the environment, colonization of humans, clinical similarity to tuberculosis, and slow growth in culture make NTM especially hard to diagnose. The study aimed to analyze the epidemiology and clinical course of NTM among patients with HIV. This study included patients with NTM and HIV admitted to our department between 2017 and 2023. Medical records were analyzed and data were gathered on age, sex, median time from HIV diagnosis to identification of NTM infection, median CD4 count at NTM diagnosis, methods of determining NTM infection, species of mycobacteria identified, clinical symptoms, and treatment course. Twenty-four patients (20 men, 4 women) with identified NTM were included in this study. Among them, 20 were HIV late presenters. The patients' median age was 40. The main symptoms were fever, weight loss and cough. Pulmonary disease, confirmed by positive cultures from sputum or bronchoalveolar lavage, was present in 18 patients. M. avium was the most common species identified. M. marinum caused disseminated skin lesions in 1 patient. Five patients were not treated for NTM because of a lack of symptoms and suspected colonization. Concomitant tuberculosis was present in 6 patients. The median time from HIV diagnosis to NTM diagnosis was 3.5 months. The median CD4 count at NTM identification was 69.5 cells/μl. Median NTM treatment time was 16 months, although 7 patients had not yet finished their treatment. The most commonly used medications were ethambutol and clarithromycin. Four of the analyzed patients died. NTM infections remain an important disease among patients who are HIV late presenters. This disease should be taken into consideration during the differential diagnosis of fever, weight loss and cough in people with HIV with a lymphocyte T CD4 count <100 cells/μl. The presence of tuberculosis does not exclude nontuberculous mycobacterium coinfection.

Keywords: mycobacteriosis, HIV, late presenter, epidemiology

Procedia PDF Downloads 20
6770 Gestalt in Music and Brain: A Non-Linear Chaos Based Study with Detrended/Adaptive Fractal Analysis

Authors: Shankha Sanyal, Archi Banerjee, Sayan Biswas, Sourya Sengupta, Sayan Nag, Ranjan Sengupta, Dipak Ghosh

Abstract:

The term ‘gestalt’ has been widely used in the field of psychology to describe the human mind's tendency to perceive any object not in parts but as a unified whole. Music, in general, is polyphonic – i.e. a combination of a number of pure tones (frequencies) mixed together in a manner that sounds harmonious. The study of human brain response to different frequency groups of an acoustic signal can give us excellent insight into the neural and functional architecture of brain functions. Hence, the study of music cognition using neuro-biosensors is becoming a rapidly emerging field of research. In this work, we have tried to analyze the effect of different frequency bands of music on the various frequency rhythms of the human brain obtained from EEG data. Four widely popular Rabindrasangeet clips were subjected to the wavelet transform method to extract five resonant frequency bands from the original music signal. These frequency bands were initially analyzed with detrended fractal analysis (DFA) and adaptive fractal analysis (AFA). A listening test was conducted on a pool of 100 respondents to assess the frequency band at which the music becomes non-recognizable. Next, these resonant frequency bands were presented to 20 subjects as auditory stimuli, with EEG signals recorded simultaneously at 19 different locations on the scalp. The recorded EEG signals were noise-cleaned and again subjected to the DFA/AFA technique in the alpha, theta and gamma frequency ranges. Thus, we obtained the scaling exponents from the two methods in the alpha, theta and gamma EEG rhythms corresponding to different frequency bands of music. From the analysis of the music signal, it is seen that loss of recognition is proportional to the loss of long-range correlation in the signal. From the EEG signal analysis, we obtain frequency-specific arousal-based responses in different lobes of the brain as well as in specific EEG bands corresponding to the musical stimuli. In this way, we aim to identify a specific frequency band beyond which the music becomes non-recognizable and below which, even in the absence of the other bands, the music remains perceivable to the audience. This revelation can be of immense importance to the field of cognitive music therapy and to researchers of creativity.
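A minimal, self-contained sketch of first-order detrended fluctuation analysis of the kind named above (illustrative only; the scale choices and test signal are assumptions, and the authors' AFA variant is not reproduced):

```python
import math, random

def linfit(xs, ys):
    """Least squares intercept and slope, used for detrending and log-log fitting."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def dfa_alpha(signal, scales):
    """First-order DFA: scaling exponent alpha = log-log slope of F(n) vs n."""
    mean = sum(signal) / len(signal)
    profile, acc = [], 0.0
    for v in signal:                                      # integrated, mean-subtracted profile
        acc += v - mean
        profile.append(acc)
    log_n, log_f = [], []
    for n in scales:
        sq, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):   # non-overlapping windows
            seg = profile[start:start + n]
            xs = list(range(n))
            a, b = linfit(xs, seg)                        # linear detrend per window
            sq += sum((y - (a + b * x)) ** 2 for x, y in zip(xs, seg))
            count += n
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(sq / count))          # log of RMS fluctuation F(n)
    return linfit(log_n, log_f)[1]

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(4096)]
print(f"alpha = {dfa_alpha(noise, [16, 32, 64, 128, 256]):.2f}")  # near 0.5 for white noise
```

Long-range correlated signals (such as the recognizable music bands discussed above) would yield an exponent well above 0.5, while its loss pushes the exponent toward the uncorrelated-noise value.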

Keywords: AFA, DFA, EEG, gestalt in music, Hurst exponent

Procedia PDF Downloads 303
6769 Heat Transfer and Diffusion Modelling

Authors: R. Whalley

Abstract:

The heat transfer modelling for a diffusion process will be considered. Difficulties in computing the time-distance dynamics of the representation will be addressed. Incomplete and irrational Laplace functions will be identified as the computational issue. Alternative approaches to the response evaluation process will be provided. An illustrative application problem will be presented. Graphical results confirming the theoretical procedures employed will be provided.

Keywords: heat, transfer, diffusion, modelling, computation

Procedia PDF Downloads 531
6768 Lessons Learnt from Industry: Achieving Net Gain Outcomes for Biodiversity

Authors: Julia Baker

Abstract:

Development plays a major role in stopping biodiversity loss. But the ‘silo species’ protection of legislation (where certain species are protected while many are not) means that development can be ‘legally compliant’ and result in biodiversity loss. ‘Net Gain’ (NG) policies can help overcome this by making it an absolute requirement that development causes no overall loss of biodiversity and brings a benefit. However, offsetting biodiversity losses in one location with gains elsewhere is controversial because people suspect ‘offsetting’ to be an easy way for developers to buy their way out of conservation requirements. Yet the good practice principles (GPP) of offsetting provide several advantages over existing legislation for protecting biodiversity from development. This presentation describes the learning from implementing NG approaches based on GPP. It regards major upgrades of the UK’s transport networks, which involved removing vegetation in order to construct and safely operate new infrastructure. While low-lying habitats were retained, trees and other habitats disrupting the running or safety of transport networks could not. Consequently, achieving NG within the transport corridor was not possible and offsetting was required. The first ‘lessons learnt’ were on obtaining a commitment from business leaders to go beyond legislative requirements and deliver NG, and on the institutional change necessary to embed GPP within daily operations. These issues can only be addressed when the challenges that biodiversity poses for business are overcome. These challenges included: biodiversity cannot be measured easily unlike other sustainability factors like carbon and water that have metrics for target-setting and measuring progress; and, the mindset that biodiversity costs money and does not generate cash in return, which is the opposite of carbon or waste for example, where people can see how ‘sustainability’ actions save money. 
The challenges were overcome by presenting the GPP of NG as a cost-efficient solution to specific, critical risks facing the business that also boosts industry recognition, and by using government-issued NG metrics to develop business-specific toolkits charting NG progress while ensuring that NG decision-making was based on rich ecological data. Institutional change was best achieved by supporting, mentoring and training sustainability/environmental managers, enabling these 'frontline' staff to embed GPP within the business. The second lesson came from implementing the GPP where business partnered with local governments, wildlife groups and landowners to support their priorities for nature conservation, and where these partners had a say in decisions about where and how best to achieve NG. With this inclusive approach, offsetting contributed towards conservation priorities when all collaborated to manage trade-offs between:
- delivering ecologically equivalent offsets, or compensating for losses of one type of biodiversity by providing another;
- achieving NG locally to the development while contributing towards national conservation priorities through landscape-level planning;
- not just protecting the extent and condition of existing biodiversity but 'doing more'.
The multi-sector collaborations identified practical, workable solutions to the 'in perpetuity' requirement. Key, however, was strengthening the linkages between biodiversity measures implemented for development and the conservation work undertaken by local organizations, so that developers support NG initiatives that really count.

Keywords: biodiversity offsetting, development, nature conservation planning, net gain

Procedia PDF Downloads 170
6767 Estimation of Population Mean under Random Non-Response in Two-Occasion Successive Sampling

Authors: M. Khalid, G. N. Singh

Abstract:

In this paper, we consider the problem of estimating the population mean on the current (second) occasion in two-occasion successive sampling under random non-response. Some modified exponential-type estimators are proposed, and their properties are studied under the assumption that the number of sampling units follows a discrete distribution due to random non-response. The performance of the proposed estimators is compared with a linear combination of two estimators, (a) the sample mean estimator for the fresh sample and (b) the ratio estimator for the matched sample, under complete response. Results are demonstrated through empirical studies, which show the effectiveness of the proposed estimators. Suitable recommendations are made for survey practitioners.
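For reference, the matched-sample benchmark (b) above is the classical ratio estimator, which scales the sample mean of the study variable by the known population mean of an auxiliary variable; the data below are hypothetical:

```python
def ratio_estimate(y_sample, x_sample, x_pop_mean):
    """Classical ratio estimator of the population mean of y, using an
    auxiliary variable x whose population mean is known."""
    y_bar = sum(y_sample) / len(y_sample)
    x_bar = sum(x_sample) / len(x_sample)
    return (y_bar / x_bar) * x_pop_mean

# Hypothetical matched sample in which y is roughly proportional to x.
y = [12.0, 18.5, 25.1, 31.0, 36.8]
x = [10.0, 15.0, 20.0, 25.0, 30.0]
X_POP_MEAN = 22.0  # assumed known, e.g. from the previous occasion

print(f"ratio estimate of the mean of y: {ratio_estimate(y, x, X_POP_MEAN):.2f}")
```

The estimator gains over the plain sample mean precisely when y and x are strongly positively correlated, which is why it serves as the matched-sample component of the benchmark combination.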

Keywords: modified exponential estimator, successive sampling, random non-response, auxiliary variable, bias, mean square error

Procedia PDF Downloads 331
6766 Morphological Analysis of English L1-Persian L2 Adult Learners’ Interlanguage: From the Perspective of SLA Variation

Authors: Maassoumeh Bemani Naeini

Abstract:

Studies on interlanguage have long been engaged in describing the phenomenon of variation in SLA. Pursuing the same goal, and particularly addressing the role of linguistic features, this study describes the use of Persian morphology in the interlanguage of two adult English-speaking learners of Persian as a second language. Taking the general approach of a combination of contrastive analysis, error analysis and interlanguage analysis, this study focuses on the identification and prediction of possible instances of transfer from English L1 to Persian L2 across six elicitation tasks, aiming to investigate whether any contextual features may variably influence the learners' order of morpheme accuracy in the areas of copula, possessives, articles, demonstratives, plural forms, personal pronouns, and genitive cases. Results indicate the existence of task variation in the interlanguage system of Persian L2 learners.

Keywords: English L1, Interlanguage Analysis, Persian L2, SLA variation

Procedia PDF Downloads 295
6765 Protective Effects of Genistein against Cyclophosphamide-Induced Hepatotoxicity in Rats: Involvement of Anti-Inflammatory and Anti-Oxidant Activities

Authors: Dina F. Mansour, Dalia O. Saleh, Rasha E. Mostafa

Abstract:

Cyclophosphamide (CP), one of the most commonly used chemotherapeutic agents, has been reported to cause many side effects, including urotoxicity, cardiotoxicity, gonadotoxicity, and hepatotoxicity, which limit its clinical use. In the present study, the protective effect of genistein (GEN), the major phytoestrogen in soy products, which possesses various pharmacological activities, was investigated against CP-induced acute liver damage in rats. Forty adult Sprague-Dawley rats were allocated into five groups. The first group received the vehicles and served as the normal control. In the other groups, rats were injected with a single dose of CP (200 mg/kg, i.p.). The last three groups were pretreated with subcutaneous GEN at doses of 0.5, 1 and 2 mg/kg/day, respectively, for 15 consecutive days prior to CP injection. Forty-eight hours after CP injection, rats of all groups were investigated for serum levels of alanine transaminase and aspartate transaminase, as well as liver contents of reduced glutathione, malondialdehyde, nitrite, interleukin-1β, and myeloperoxidase. Histopathological examination of liver tissues was also conducted. CP resulted in acute liver damage in rats, as evidenced by alteration of liver function biomarkers, oxidative stress, and inflammatory markers; this was confirmed by the histopathological outcomes. Pretreatment of rats with GEN significantly protected against CP-induced deterioration of liver function and showed marked anti-oxidant and anti-inflammatory properties, as demonstrated by the biochemical and histopathological findings. In conclusion, the present findings demonstrate the protective effects of GEN against CP-induced liver damage and suggest a role of its antioxidant and anti-inflammatory activities.

Keywords: cyclophosphamide, genistein, inflammation, interleukin-1β, liver, myeloperoxidase, oxidative stress

Procedia PDF Downloads 282
6764 Non-Linear Load-Deflection Response of Shape Memory Alloys-Reinforced Composite Cylindrical Shells under Uniform Radial Load

Authors: Behrang Tavousi Tehrani, Mohammad-Zaman Kabir

Abstract:

Shape memory alloys (SMA) are often implemented in smart structures as the active components. Their ability to recover large displacements has been used in many applications, including structural stability/response enhancement and active structural acoustic control. SMA wires or fibers can be embedded within composite cylinders to increase their critical buckling load, improve their load-deflection behavior, and reduce radial deflections under various thermo-mechanical loadings. This paper presents a semi-analytical investigation of the non-linear load-deflection response of SMA-reinforced composite circular cylindrical shells subjected to a uniform external pressure load. Based on first-order shear deformation shell theory (FSDT), the equilibrium equations of the structure are derived. The simplified one-dimensional Brinson model is used to determine the SMA recovery force, owing to its simplicity and accuracy. The Airy stress function and the Galerkin technique are used to obtain non-linear load-deflection curves. The results are verified by comparing them with those in the literature. Several parametric studies are conducted in order to investigate the effects of SMA volume fraction, SMA pre-strain value, and SMA activation temperature on the response of the structure. It is shown that suitable usage of SMA wires results in a considerable enhancement of the load-deflection response of the shell due to the generation of the SMA tensile recovery force.

Keywords: airy stress function, cylindrical shell, Galerkin technique, load-deflection curve, recovery stress, shape memory alloy

Procedia PDF Downloads 165
6763 Effects of Occupational Therapy on Children with Unilateral Cerebral Palsy

Authors: Sedef Şahin, Meral Huri

Abstract:

Cerebral Palsy (CP) represents the most frequent cause of physical disability in children, with a rate of 2.9 per 1000 live births. Activity-focused intervention is known to improve function and reduce activity limitations and barriers to participation of children with disabilities. The aim of the study was to assess the effects of occupational therapy on the level of fatigue, activity performance, and satisfaction in children with unilateral cerebral palsy. Twenty-two children with hemiparetic cerebral palsy (mean age: 9.3 ± 2.1 years; Gross Motor Function Classification System (GMFCS) level from I to V (I = 54%, II = 23%, III = 14%, IV = 9%, V = 0%); Manual Ability Classification System (MACS) level from I to V (I = 40%, II = 32%, III = 14%, IV = 10%, V = 4%)) were assigned to an occupational therapy program for 6 weeks. A Visual Analogue Scale (VAS) was used to rate the intensity of fatigue experienced at the time on a 10-point Likert scale (1-10). Activity performance and satisfaction were measured with the Canadian Occupational Performance Measure (COPM). A client-centered occupational therapy intervention was designed according to the results of the COPM. The results before and after the intervention were compared with the nonparametric Wilcoxon test. Thirteen of the children were right-handed, whereas nine were left-handed. Six weeks of intervention showed a statistically significant difference in the level of fatigue compared to the first assessment (p < 0.05). The mean first and second activity performance scores were 4.51 ± 1.70 and 7.35 ± 2.51, respectively, a statistically significant difference (p < 0.01). The mean first and second activity satisfaction scores were 2.30 ± 1.05 and 5.51 ± 2.26, respectively, also a statistically significant difference (p < 0.01).
Occupational therapy is an evidence-based approach, and the occupational therapy interventions implemented by therapists were clinically effective on the severity of fatigue, activity performance, and satisfaction when implemented individually over 6 weeks.

Keywords: activity performance, cerebral palsy, fatigue, occupational therapy

Procedia PDF Downloads 216
6762 Problem Solving in Chilean Higher Education: Figurations Prior in Interpretations of Cartesian Graphs

Authors: Verónica Díaz

Abstract:

A Cartesian graph, as a mathematical object, becomes a tool for the configuration of change. It is best understood through everyday-life problem solving associated with its representation. Despite this, the current educational framework favors general graphs, without consideration of their argumentation: students are required to find the mathematical function without associating it with the development of graphical language. This research describes the use students make of configurations produced prior to Cartesian graphs when working on an everyday-life problem involving a time and distance variation phenomenon. The theoretical framework describes the conditions of study of the function and its modeling. This is a qualitative, descriptive study involving six undergraduate case studies carried out during the first term of 2016 at the University of Los Lagos. The research problem concerned the graphic modeling of a real person's movement, and two levels of analysis were identified. The first level aims to identify local and global graph interpretations; the second describes the degree of iconicity and referentiality of an image. According to the results, students drew no figures before the Cartesian graph, highlighting their need to represent the context and the movement that causes the change in the phenomenon. From this, they produced Cartesian graphs representing changes in position and thus achieved a global view of the graph. The local view, however, only indicates specific events in the problem situation, using graphic and verbal expressions to represent movement. This view does not make it possible to identify what happens on the graph when the movement characteristics change, such as changes in the person's walking speed along possible paths.

Keywords: cartesian graphs, higher education, movement modeling, problem solving

Procedia PDF Downloads 196
6761 The Size Effects of Keyboards (Keycaps) on Computer Typing Tasks

Authors: Chih-Chun Lai, Jun-Yu Wang

Abstract:

The keyboard is the most important equipment for computer tasks. However, improper keyboard design can cause symptoms such as ulnar and/or radial deviation. The goal of this study was to investigate the optimal size(s) of keycaps for increasing typing efficiency. As shown in the questionnaire pre-study with 49 participants aged from 20 to 44, the most commonly used keyboards were 101-key standard keyboards, and most keycap sizes (W × L) were 1.3 × 1.5 cm and 1.5 × 1.5 cm. The fingertip breadth of most participants was 1.2 cm. Therefore, in the main study with 18 participants, a standard keyboard fitted in turn with each of three keycap sizes (1.2 × 1.4 cm, 1.3 × 1.5 cm, and 1.5 × 1.5 cm) was used to investigate typing efficiency. The results revealed that the difference in operating times between the 1.3 × 1.5 cm and 1.2 × 1.4 cm keycaps was insignificant, while operating times with the 1.5 × 1.5 cm keycaps were significantly longer than with either of the smaller sizes. As for the typing error rate, there was no significant difference.

Keywords: keyboard, keycap size, typing efficiency, computer tasks

Procedia PDF Downloads 359
6760 Bias in the Estimation of Covariance Matrices and Optimality Criteria

Authors: Juan M. Rodriguez-Diaz

Abstract:

The precision of parameter estimators in the Gaussian linear model is traditionally described by the variance-covariance matrix of the asymptotic distribution. However, this measure can underestimate the true variance, especially for small samples. Optimal design theory traditionally addresses this variance through its relationship with the model's information matrix. For this reason it seems convenient, at least in some cases, to adapt the optimality criteria in order to obtain the best designs for the actual variance structure; otherwise the loss in efficiency of designs obtained with the traditional approach may be substantial.
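As background for the criterion adaptation described above, the textbook expressions (a standard sketch, not the paper's own derivation) are:

```latex
% Ordinary least squares in the Gaussian linear model y = X\beta + \varepsilon
\hat{\beta} = (X^{\top}X)^{-1}X^{\top}y, \qquad
\operatorname{Cov}(\hat{\beta}) = \sigma^{2}(X^{\top}X)^{-1}.
% D-optimality maximizes the determinant of the information matrix
\xi^{*} = \arg\max_{\xi}\,\det M(\xi), \qquad M(\xi) \propto X^{\top}X.
% With correlated observations, \operatorname{Cov}(\varepsilon) = \sigma^{2}V,
% the true variance of the generalized least-squares estimator is
\operatorname{Cov}(\hat{\beta}_{\mathrm{GLS}}) = \sigma^{2}(X^{\top}V^{-1}X)^{-1}.
```

Criteria built on $X^{\top}X$ alone can therefore misstate the estimator's true precision, which is the motivation for adapting the criteria to the actual variance structure.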

Keywords: correlated observations, information matrix, optimality criteria, variance-covariance matrix

Procedia PDF Downloads 409
6759 Evaluation of Natural Gums: Gum Tragacanth, Xanthan Gum, Guar Gum and Gum Acacia as Potential Hemostatic Agents

Authors: Himanshu Kushwah, Nidhi Sandal, Meenakshi K. Chauhan, Gaurav Mittal

Abstract:

Excessive bleeding is the primary cause of avoidable death in both civilian trauma centers and on the military battlefield. Hundreds of Indian troops die every year from blood loss caused by combat-related injuries. These deaths are preventable to a large extent by making a suitable hemostatic dressing available in emergency medical kits. In this study, natural gums were evaluated as potential hemostatic agents in combination with calcium gluconate. The study compares the hemostatic activity of Gum Tragacanth (GT), Guar Gum (GG), Xanthan Gum (XG), and Gum Acacia (GA) through different in-vitro and in-vivo studies. In-vitro studies were performed using the Lee-White and Eustrek methods, which include visual and microscopic analysis of blood clotting. An MTT assay was also performed on human lymphocytes to check the cytotoxicity of the gums. The in-vivo studies were performed in Sprague Dawley rats using a tail bleeding assay to evaluate the hemostatic efficacy of the gums, compared with a commercially available hemostatic sponge, Surgispon. An erythrocyte agglutination test was also performed to check the interaction between blood cells and the natural gums. Other parameters, such as blood loss, adherence strength of the developed hemostatic dressing material incorporating these gums, re-bleeding, and survival of the animals, were also studied. The data obtained from the MTT assay showed that Guar gum, Gum Tragacanth, and Gum Acacia were not significantly cytotoxic, but substantial cytotoxicity was observed in Xanthan gum samples at high concentrations. Xanthan gum also took the least time, at its minimum concentration, to achieve hemostasis (approximately 50 seconds at a 3 mg concentration). Gum Tragacanth achieved hemostasis in the same time at a concentration of 35 mg, but the other two gums were unable to clot the blood in a comparably short time.
A sponge dressing made of Tragacanth gum proved more efficient in achieving hemostasis and showed better practical applicability than the other gums studied, as well as the commercially available product Surgispon, making it a potentially better alternative.

Keywords: cytotoxicity, hemostasis, natural gums, sponge

Procedia PDF Downloads 126
6758 Dry Sliding Wear Behaviour of Ti3SiC2 and the Effect of TiC on It

Authors: Bendaoudi Seif-Eddine, Bounazef Mokhtar

Abstract:

The wear behaviour of a Ti3SiC2 coating in sliding contact under dry conditions was investigated at different pressures (0.1-0.8 MPa) and at various speeds from 5 to 60 m/s. The ball-on-disc sliding-wear test was performed in ambient air with a relative humidity of 20%. An equation has been proposed to predict wear rates and describe the sliding wear caused by a corundum ball on the studied material. The results show how the wear rate, measured by mass loss, varies in the range of 0.6-3.8 × 10⁻⁶ mm³/N·m with sliding distance under the various test conditions; it increases with increasing load and increases rapidly with speed. The influence of TiC impurities on the wear behaviour was also investigated.
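The paper's own predictive equation is not given in the abstract; as context only, specific wear rates in these units are commonly expressed through an Archard-type relation (a textbook illustration, not the authors' model):

```latex
% Archard wear law: wear volume V for normal load F_N, sliding distance s,
% material hardness H, and dimensionless wear coefficient k
V = k\,\frac{F_{N}\,s}{H}, \qquad
w = \frac{V}{F_{N}\,s} = \frac{k}{H}
```

Here $w$ is the specific wear rate in mm³/N·m, the unit quoted above; load- and speed-dependence such as that reported here is usually captured by letting $k$ vary with the test conditions.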

Keywords: ball-on-disc, dry-sliding, Ti3SiC2, wear

Procedia PDF Downloads 245
6757 Decrease in Olfactory Cortex Volume and Alterations in Caspase Expression in the Olfactory Bulb in the Pathogenesis of Alzheimer’s Disease

Authors: Majed Al Otaibi, Melissa Lessard-Beaudoin, Amel Loudghi, Raphael Chouinard-Watkins, Melanie Plourde, Frederic Calon, C. Alexandre Castellano, Stephen Cunnane, Helene Payette, Pierrette Gaudreau, Denis Gris, Rona K. Graham

Abstract:

Introduction: Alzheimer's disease (AD) is a chronic disorder that affects millions of individuals worldwide. Symptoms include memory dysfunction as well as alterations in attention, planning, language, and overall cognitive function. Olfactory dysfunction is a common symptom of several neurological disorders, including AD. Studying the mechanisms underlying olfactory dysfunction may therefore lead to the discovery of potential biomarkers and/or treatments for neurodegenerative diseases. Objectives: To determine whether olfactory dysfunction predicts future cognitive impairment in the aging population and to characterize the olfactory system in a murine model expressing a genetic risk factor for AD. Method: For the human study, quantitative olfactory tests (UPSIT and OMT) were administered to 93 subjects (aged 80 to 94 years) from the Quebec Longitudinal Study on Nutrition and Successful Aging (NuAge) cohort who agreed to participate in the ORCA secondary study. The telephone Modified Mini-Mental State examination (t-MMSE) was used to assess cognition, and an olfactory self-report was also collected. In a separate cohort, olfactory cortical volume was calculated from MRI results of healthy older adults (n=25) and patients with AD (n=18) using the AAL single-subject atlas and the PNEURO tool (PMOD 3.7). For the murine study, we are using Western blotting, RT-PCR, and immunohistochemistry. Results: Human study: Based on the self-report, 81% of the participants claimed not to suffer from any problem with olfaction. However, based on the UPSIT, 94% of those subjects showed poor olfactory performance and different forms of microsmia. Moreover, the results confirm that olfactory function declines with age. We also detected a significant decrease in olfactory cortical volume in AD individuals compared to controls.
Murine study: Preliminary data demonstrate a significant decrease in the expression levels of the proform of caspase-3 and the caspase substrate STK3 in the olfactory bulb of mice expressing human APOE4 compared with controls. In addition, there is a significant decrease in the expression levels of the caspase-9 proform and the caspase-8 active fragment. Analysis of the mature neuron marker NeuN shows decreased expression levels of both isoforms. The data also suggest that Iba-1 immunostaining is increased in the olfactory bulb of APOE4 mice compared to wild-type mice. Conclusions: The activation of caspase-3 may cause the decreased levels of STK3 through caspase cleavage and may play a role in the inflammation observed. In the clinical study, our results suggest that seniors are unaware of their olfactory function status, and that self-report is therefore not a sufficient measure of olfaction in the elderly. Studying olfactory function and cognitive performance in the aging population will help to discover biomarkers for the early stages of AD.

Keywords: Alzheimer's disease, APOE4, cognition, caspase, brain atrophy, neurodegenerative, olfactory dysfunction

Procedia PDF Downloads 235
6756 Permeable Reactive Pavement for Controlling the Transport of Benzene, Toluene, Ethyl-Benzene, and Xylene (BTEX) Contaminants

Authors: Shengyi Huang, Chenju Liang

Abstract:

Volatile organic compounds such as benzene, toluene, ethyl-benzene, and xylene (BTEX) are common contaminants in the environment and can originate from asphalt concrete or vehicle exhaust emissions. BTEX may invade the subsurface environment via wet and dry atmospheric deposition. Without effective ways of controlling the contaminants' fate and transport, they can extensively harm the natural environment. In the 1st phase of this study, various adsorbents were screened to find a suitable additive for the porous asphalt mixture. In the 2nd phase, the selected adsorbent was incorporated into the design of porous asphalt concrete (PAC) to produce a permeable reactive pavement (PRP), which was tested in the 3rd phase for its potential to adsorb aqueous BTEX in comparison with the PAC. The PRP was prepared according to the following steps: firstly, the suitable adsorbent was chosen based on the analytical results of specific surface area analysis, thermal-gravimetric analysis, adsorption kinetics and isotherms, and thermodynamics analysis; secondly, the coarse aggregate, fine aggregate, filler, asphalt, and fiber were tested to meet the regulated specifications (e.g., water absorption, soundness, viscosity) for preparing the PRP; thirdly, the amount of adsorbent additive in the PRP was determined; fourthly, the prepared PAC and PRP were examined for their physical properties (e.g., abrasion loss, drain-down loss, Marshall stability, Marshall flow, dynamic stability). As a result, the PRP showed better physical performance than the traditional PAC. Finally, Marshall specimen column tests were conducted to explore the adsorption capacities of the PAC and PRPs. The BTEX adsorption capacities of the PRPs were higher than those obtained with the traditional PAC.
In summary, PRPs showed superior physical performance and adsorption capacities, which exhibit the potential of PRP to be applied as a replacement of PAC for better controlling the transport of non-point source pollutants.

Keywords: porous asphalt concrete, volatile organic compounds, permeable reactive pavement, non-point source pollution

Procedia PDF Downloads 188
6755 Attribute Selection for Preference Functions in Engineering Design

Authors: Ali E. Abbas

Abstract:

Industrial Engineering is a broad multidisciplinary field with intersections and applications in numerous areas. When designing a product, it is important to determine the appropriate attributes of value and the preference function for which the product is optimized. This paper provides some guidelines on appropriate selection of attributes for preference and value functions for engineering design.

Keywords: decision analysis, industrial engineering, direct vs. indirect values, engineering management

Procedia PDF Downloads 281
6754 Detection of High Fructose Corn Syrup in Honey by Near Infrared Spectroscopy and Chemometrics

Authors: Mercedes Bertotto, Marcelo Bello, Hector Goicoechea, Veronica Fusca

Abstract:

The National Service of Agri-Food Health and Quality (SENASA), controls honey to detect contamination by synthetic or natural chemical substances and establishes and controls the traceability of the product. The utility of near-infrared spectroscopy for the detection of adulteration of honey with high fructose corn syrup (HFCS) was investigated. First of all, a mixture of different authentic artisanal Argentinian honey was prepared to cover as much heterogeneity as possible. Then, mixtures were prepared by adding different concentrations of high fructose corn syrup (HFCS) to samples of the honey pool. 237 samples were used, 108 of them were authentic honey and 129 samples corresponded to honey adulterated with HFCS between 1 and 10%. They were stored unrefrigerated from time of production until scanning and were not filtered after receipt in the laboratory. Immediately prior to spectral collection, honey was incubated at 40°C overnight to dissolve any crystalline material, manually stirred to achieve homogeneity and adjusted to a standard solids content (70° Brix) with distilled water. Adulterant solutions were also adjusted to 70° Brix. Samples were measured by NIR spectroscopy in the range of 650 to 7000 cm⁻¹. The technique of specular reflectance was used, with a lens aperture range of 150 mm. Pretreatment of the spectra was performed by Standard Normal Variate (SNV). The ant colony optimization genetic algorithm sample selection (ACOGASS) graphical interface was used, using MATLAB version 5.3, to select the variables with the greatest discriminating power. The data set was divided into a validation set and a calibration set, using the Kennard-Stone (KS) algorithm. A combined method of Potential Functions (PF) was chosen together with Partial Least Square Linear Discriminant Analysis (PLS-DA). 
Estimators of the predictive capacity of the model were compared using a decreasing number of cross-validation groups, which implies more demanding validation conditions. The optimal number of latent variables was selected as the number associated with the minimum error and the smallest number of unassigned samples. Once the optimal number of latent variables was defined, the model was applied to the training samples, and the calibrated model was then used to study the validation samples. The calibrated model combining the potential function method and PLS-DA can be considered reliable and stable, since its performance on future samples is expected to be comparable to that achieved for the training samples. By the combined use of Potential Functions (PF) and Partial Least Squares Discriminant Analysis (PLS-DA) classification, authentic honey and honey adulterated with HFCS could be identified with a correct classification rate of 97.9%. The results showed that NIR spectroscopy in combination with the PF and PLS-DA methods can be a simple, fast, and low-cost technique for the detection of HFCS in honey with high sensitivity and discriminating power.
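Two of the preprocessing steps named above, SNV pretreatment and the Kennard-Stone calibration/validation split, can be sketched in NumPy (an illustrative sketch of the standard algorithms, not the authors' code; the function names are assumptions):

```python
import numpy as np

def snv(spectra):
    # Standard Normal Variate: center and scale each spectrum (row)
    # individually, removing multiplicative scatter effects.
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    return (spectra - mu) / sd

def kennard_stone(X, n_select):
    # Kennard-Stone sample selection: start with the two most distant
    # samples, then repeatedly add the sample whose minimum distance
    # to the already-selected set is largest.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    i, j = np.unravel_index(np.argmax(d), d.shape)
    selected = [int(i), int(j)]
    remaining = [k for k in range(len(X)) if k not in selected]
    while len(selected) < n_select and remaining:
        min_d = d[np.ix_(remaining, selected)].min(axis=1)
        pick = remaining[int(np.argmax(min_d))]
        selected.append(pick)
        remaining.remove(pick)
    return selected
```

The samples chosen by `kennard_stone` form the calibration set and the rest the validation set, which is what makes the split more demanding than a random one: the validation samples always lie inside the calibration set's coverage of the spectral space.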

Keywords: adulteration, multivariate analysis, potential functions, regression

Procedia PDF Downloads 104
6753 Study of the Vertical Handoff in Heterogeneous Networks and Implement Based on Opnet

Authors: Wafa Benaatou, Adnane Latif

Abstract:

This paper studies in detail the performance of vertical handover in WLAN, WiMAX, and UMTS networks, then examines the vertical handoff procedure, and concludes with simulations that highlight handover performance in heterogeneous networks. The goal of vertical handover is to support several accesses in real time across heterogeneous networks. It enables a user to use several networks (such as WLAN, UMTS, and WiMAX) in parallel, and the system to switch automatically to another base station without disconnecting, as if there were no interruption and with as little data loss as possible.

Keywords: vertical handoff, WLAN, UMTS, WIMAX, heterogeneous

Procedia PDF Downloads 363
6752 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data

Authors: Sašo Pečnik, Borut Žalik

Abstract:

This paper presents a real-time visualization technique with filtering of classified LiDAR point clouds. The visualization can display filtered information organized in layers according to the classification attribute stored within LiDAR data sets. We explain the data structure and data management used, which enable real-time presentation of layered LiDAR data. Real-time visualization is achieved with level-of-detail (LOD) optimization based on the distance from the observer, without loss of quality. The filtering process is done in two steps, entirely executed on the GPU and implemented using programmable shaders.
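The two ideas above, filtering by classification attribute and distance-based LOD thinning, can be sketched on the CPU as follows (the paper executes this on the GPU in shaders; the band thresholds, thinning rates, and function name here are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def lod_filter(points, classes, keep_classes, camera,
               lod_bands=(50.0, 200.0), rates=(1, 4, 16)):
    # Step 1: keep only points whose LiDAR classification code is wanted.
    mask = np.isin(classes, list(keep_classes))
    pts = points[mask]
    # Step 2: thin the survivors by distance band from the observer:
    # near points are kept densely, far points sparsely.
    dist = np.linalg.norm(pts - camera, axis=1)
    bands = np.digitize(dist, lod_bands)   # 0 = nearest band
    keep = np.zeros(len(pts), dtype=bool)
    for band, rate in enumerate(rates):
        idx = np.where(bands == band)[0]
        keep[idx[::rate]] = True           # keep every rate-th point
    return pts[keep]
```

On the GPU the same two decisions (classification test, distance-dependent decimation) would be made per point in a programmable shader rather than in a Python loop; this sketch only shows the logic.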

Keywords: filtering, graphics, level-of-details, LiDAR, real-time visualization

Procedia PDF Downloads 279
6751 A Comparative Study of Insurance Policies Worldwide in Public Private Partnerships

Authors: Guanqun Shi, Xueqing Zhang

Abstract:

The frequent occurrence of failures in PPP projects, which have caused great losses, has drawn attention from both governments and concessionaires. PPPs are complex arrangements owing to their long operation periods and multiple players, and many types of risk in PPP projects may cause a project to fail. Insurance is an important tool for transferring these risks. Through a comparison and analysis of international government PPP guidelines and contracts, as well as case studies worldwide, we identified eight main insurance principles and discussed thirteen insurance types across the different project stages. An overall procedure is proposed to improve insurance practices in PPP projects.

Keywords: public private partnerships, insurance, contract, risk

Procedia PDF Downloads 255