Search results for: accounting information quality
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18891

13881 Advanced Bio-Fuels for Biorefineries: Incorporation of Waste Tires and Calcium-Based Catalysts to the Pyrolysis of Biomass

Authors: Alberto Veses, Olga Sanhauja, María Soledad Callén, Tomás García

Abstract:

The appropriate use of renewable sources is decisive for minimizing the environmental impact caused by fossil fuel use. In particular, lignocellulosic biomass is one of the most promising alternatives, since it is the only carbon-containing renewable source that can produce bioproducts similar to fossil fuels, and it does not compete with the food market. Among the processes that can valorize lignocellulosic biomass, pyrolysis is an attractive alternative because it is the only thermochemical process that produces a liquid biofuel (bio-oil) in a simple way, together with solid and gas fractions that can be used as energy sources to support the process. However, in order to incorporate bio-oils into current infrastructures and process them further in future biorefineries, their quality needs to be improved. Introducing low-cost catalysts and/or incorporating polymer residues into the process are simple, low-cost strategies that allow advanced bio-oils to be obtained directly and economically for use in future biorefineries. Accordingly, based on previous thermogravimetric analyses, a local agricultural waste, grape seeds (GS), was selected as the lignocellulosic biomass, while waste tires (WT) were selected as the polymer residue. CaO was selected as the low-cost catalyst based on the group's previous experience. To this end, a specially designed fixed-bed reactor using N₂ as carrier gas was used. This reactor incorporates a vertical mobile liner that allows the feedstock to be introduced into the oven once the selected temperature (550 ºC) is reached, ensuring the higher heating rates needed for the process. Obtaining a well-defined phase distribution in the resulting bio-oil is crucial to ensure the viability of the process.
Thus, once the experiments were carried out, not only were two well-defined layers observed when introducing several mixtures (reaching values up to 40 wt.% of WT), but also an upgraded organic phase, which is the one considered for processing in further biorefineries. Radical interactions between GS and WT released during pyrolysis, together with dehydration reactions enhanced by CaO, can promote the formation of better-quality bio-oils. This was reflected in a reduction of the water and oxygen content of the bio-oil and, hence, a substantial increase in its heating value and stability. Moreover, not only was the sulphur content reduced relative to pyrolysis of WT alone, but the potential problems associated with the strongly acidic environment of conventional bio-oils were also minimized, owing to the bio-oil's basic pH and lower total acid number. Acidic compounds produced in the pyrolysis, such as CO₂-like substances, can react with the CaO and minimize the acidity problems typical of lignocellulosic bio-oils. In addition, this CO₂ capture promotes H₂ production via the water-gas shift reaction, favoring hydrogen-transfer reactions and improving the final quality of the bio-oil. These results show the great potential of grape seeds for catalytic co-pyrolysis with different plastic residues to produce a liquid bio-oil that can be considered a high-quality renewable vector.

Keywords: advanced bio-oils, biorefinery, catalytic co-pyrolysis of biomass and waste tires, lignocellulosic biomass

Procedia PDF Downloads 221
13880 Assessing Economic Losses of the 2014 Flood Disaster: A Case Study on Dabong, Kelantan, Malaysia

Authors: Ahmad Hamidi Mohamed, Jamaluddin Othman, Mashitah Suid, Mohd Zaim Mohd Shukri

Abstract:

Floods are an annual natural disaster in Kelantan; however, the record-setting flood of 2014 was a 'tsunami-like disaster'. A study was conducted to assess the economic impact of the flood on residents of the Dabong area in Kelantan Darul Naim, Malaysia. This area was selected because of the severity of the flooding there. The impacts of the flood on local people were assessed through structured interviews using questionnaires intended to acquire information on the losses faced by Dabong residents. The questionnaires covered various areas of hardship, including illnesses suffered, their intensity and duration, and their associated costs. Loss of productivity and quality of life was also assessed. Formal requests to, and meetings with, government agencies were used to obtain relevant statistical data on losses due to the flood tragedy. From the study, a staggering amount of losses was calculated, comprising losses in property, farming/agriculture, trade/business, health, insurance, and government assets. The flood brought hardship to the people of Dabong, and the loss of homes caused lasting inconvenience to the community. The huge economic loss documented in this study shows that the federal government and the state government of Kelantan need to determine the causes of the major 2014 flood. Fast and effective measures must be planned and implemented in flood-prone areas to prevent the same tragedy from happening in the future.

Keywords: economic impact, flood tragedy, Malaysia, property losses

Procedia PDF Downloads 252
13879 Predicting Groundwater Areas Using Data Mining Techniques: Groundwater in Jordan as Case Study

Authors: Faisal Aburub, Wael Hadi

Abstract:

Data mining is the process of extracting useful or hidden information from a large database. The extracted information can be used to discover relationships among features, where data objects are grouped according to logical relationships, or to assign unseen objects to one of a set of predefined groups. In this paper, we investigate four well-known data mining algorithms for predicting groundwater areas in Jordan: Support Vector Machines (SVMs), Naïve Bayes (NB), K-Nearest Neighbor (kNN), and Classification Based on Association Rule (CBA). The experimental results indicate that the SVM algorithm outperformed the other algorithms in terms of the classification accuracy, precision, and F1 evaluation measures on datasets of groundwater areas collected from the Jordanian Ministry of Water and Irrigation.
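The evaluation measures named above (accuracy, precision, F1) can be computed directly from the confusion counts of a binary prediction. The sketch below uses invented labels, not the Ministry's groundwater dataset, to show how one classifier's output might be scored:

```python
def evaluate(y_true, y_pred, positive=1):
    """Return accuracy, precision, and F1 for a binary prediction."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if t == positive and p == positive)
    fp = sum(1 for t, p in pairs if t != positive and p == positive)
    fn = sum(1 for t, p in pairs if t == positive and p != positive)
    accuracy = sum(1 for t, p in pairs if t == p) / len(pairs)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return accuracy, precision, f1

# Hypothetical labels: 1 = groundwater area, 0 = no groundwater,
# with y_pred standing in for the output of one of the four classifiers.
y_true = [1, 1, 1, 0, 0, 0]
y_pred = [1, 1, 0, 0, 0, 1]
acc, prec, f1 = evaluate(y_true, y_pred)
```

Running the same scoring function over each of the four algorithms' predictions would reproduce the kind of comparison the paper reports.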

Keywords: classification, data mining, evaluation measures, groundwater

Procedia PDF Downloads 264
13878 An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators

Authors: M. A. Okezue, K. L. Clase, S. R. Byrn

Abstract:

The requirement to maintain data integrity in laboratory operations is critical for regulatory compliance, and automation of procedures reduces the incidence of human error. Quality control laboratories in low-income economies may face barriers when attempting to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports are accurate and reliable. Zinc sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population and as an adjunct therapy in COVID-19 regimens. However, the zinc content in these formulations is determined titrimetrically, a manual analytical procedure. The assay for ZnSO4 tablets involves time-consuming steps that contain mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of the 0.1 M sodium edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopoeia was used to create a process flow for ZnSO4 tablets. For each step in the process, formulae were entered into two spreadsheets to automate the calculations. Further checks were built into the automated system to ensure the validity of replicate analyses in the titrimetric procedures. Validation was conducted using five data sets of manually computed assay results, and the acceptance criteria set in the protocol were met. Significant p-values (p < 0.05, α = 0.05, at a 95% confidence interval) were obtained from Student's t-test comparison of the mean values of the manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and the principles of data integrity were enhanced by the use of the validated spreadsheet calculators in the titrimetric evaluation of ZnSO4 tablets.
Human error in calculations was minimized when the procedures were automated in quality control laboratories. The assay procedure for the formulation was completed in a time-efficient manner with a greater level of accuracy. This project is expected to promote cost savings for laboratory business models.
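A minimal sketch of the kind of calculation the spreadsheets automate, assuming the standard 1:1 Zn:EDTA complexation (atomic weight of Zn 65.38 g/mol) and an illustrative, not USP-verbatim, replicate-validity limit:

```python
from statistics import mean, stdev

MW_ZN = 65.38  # g/mol, atomic weight of zinc

def zinc_found_mg(v_edta_ml, m_edta):
    """Zinc titrated (mg) from EDTA volume (mL) and molarity (mol/L), 1:1 stoichiometry."""
    return v_edta_ml * m_edta * MW_ZN

def percent_label_claim(zinc_mg, label_zinc_mg):
    """Assay result as a percentage of the labelled zinc content."""
    return 100.0 * zinc_mg / label_zinc_mg

def replicates_valid(values, max_rsd_pct=2.0):
    """Validity check on replicates: relative standard deviation within a limit."""
    rsd = 100.0 * stdev(values) / mean(values)
    return rsd <= max_rsd_pct

# Hypothetical replicate titrations of a tablet labelled 20 mg zinc,
# titrated with 0.1 M EDTA.
volumes_ml = [3.05, 3.06, 3.07]
results = [percent_label_claim(zinc_found_mg(v, 0.1), 20.0) for v in volumes_ml]
```

Encoding the formulae once, with the replicate check alongside, is what removes the per-analysis transcription and arithmetic errors the abstract describes.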

Keywords: data integrity, spreadsheets, titrimetry, validation, zinc sulphate tablets

Procedia PDF Downloads 158
13877 Molecular Epidemiology of Anthrax in Georgia

Authors: N. G. Vepkhvadze, T. Enukidze

Abstract:

Anthrax is a fatal zoonotic disease of animals and humans caused by strains of Bacillus anthracis, a spore-forming, gram-positive bacillus, and is also well recognized as a potential agent of bioterrorism. Infection in humans is extremely rare in the developed world and is generally due to contact with infected animals or contaminated animal products. Testing for this zoonotic disease began in 1907 in Georgia and continues routinely at the State Laboratory of Agriculture of Georgia to provide accurate information and efficient testing results. Each clinical sample is analyzed by RT-PCR and bacteriological methods; this study used real-time PCR assays for the detection of B. anthracis that rely on plasmid-encoded targets together with a chromosomal marker to correctly differentiate pathogenic strains from non-anthracis Bacillus species. During the period 2015-2022, the State Laboratory of Agriculture (SLA) tested 250 clinical and environmental (soil) samples from several different regions of Georgia. In total, 61 of the 250 samples were positive during this period. Based on the results, anthrax cases are mostly present in Eastern Georgia, where the livestock density is high, specifically in the regions of Kakheti and Kvemo Kartli. All laboratory activities are performed in accordance with international quality standards, adhering to biosafety and biosecurity rules, by qualified and experienced personnel handling pathogenic agents. Laboratory testing plays the largest role in diagnosing animals with anthrax, helping the pertinent institutions to quickly confirm a diagnosis and evaluate the epidemiological situation, which generates important data for further responses.

Keywords: animal disease, bacillus anthracis, edp, laboratory molecular diagnostics

Procedia PDF Downloads 74
13876 Current Applications of Artificial Intelligence (AI) in Chest Radiology

Authors: Angelis P. Barlampas

Abstract:

Learning Objectives: The purpose of this study is to briefly inform the reader about the applications of AI in chest radiology. Background: Currently, there are 190 FDA-approved radiology AI applications, with 42 (22%) pertaining specifically to thoracic radiology. Imaging Findings or Procedure Details: AI aids chest radiology in the following ways. Detects and segments pulmonary nodules. Subtracts bone to provide an unobstructed view of the underlying lung parenchyma, and provides further information on nodule characteristics, such as nodule location, two-dimensional size or three-dimensional (3D) volume, change in nodule size over time, attenuation data (i.e., mean, minimum, and/or maximum Hounsfield units [HU]), morphological assessments, or combinations of the above. Reclassifies indeterminate pulmonary nodules as low or high risk with higher accuracy than conventional risk models. Detects pleural effusion. Differentiates tension pneumothorax from nontension pneumothorax. Detects cardiomegaly, calcification, consolidation, mediastinal widening, atelectasis, fibrosis, and pneumoperitoneum. Automatically localizes vertebral segments, labels ribs, and detects rib fractures. Measures the distance from the tube tip to the carina and localizes both endotracheal tubes and central vascular lines. Detects consolidation and progression of parenchymal diseases such as pulmonary fibrosis or chronic obstructive pulmonary disease (COPD), and can evaluate lobar volumes. Identifies and labels pulmonary bronchi and vasculature and quantifies air-trapping. Offers emphysema evaluation. Provides functional respiratory imaging, whereby high-resolution CT images are post-processed to quantify airflow by lung region, and may be used to quantify key biomarkers such as airway resistance, air-trapping, ventilation mapping, lung and lobar volume, and blood vessel and airway volume. Assesses the lung parenchyma by way of density evaluation.
Provides percentages of tissues within defined attenuation (HU) ranges, besides furnishing automated lung segmentation and lung volume information. Improves image quality for noisy images with a built-in denoising function. Detects emphysema, a common condition in patients with a history of smoking, as well as hyperdense or opacified regions, thereby aiding in the diagnosis of certain pathologies, such as COVID-19 pneumonia. Aids in cardiac segmentation and calcium detection, aorta segmentation and diameter measurements, and vertebral body segmentation and density measurements. Conclusion: The future is yet to come, but AI is already a helpful tool in daily radiology practice. It can be assumed that the continuing progress of computerized systems and improvements in software algorithms will render AI the radiologist's second pair of hands.

Keywords: artificial intelligence, chest imaging, nodule detection, automated diagnoses

Procedia PDF Downloads 56
13875 Phytoremediation Aeration System by Using Water Lettuce (Pistia stratiotes L.) Based on Zero Waste to Reduce the Impact of Industrial Liquid Waste in Jember, Indonesia

Authors: Wahyu Eko Diyanto, Amalia Dyah Arumsari, Ulfatu Layinatinnahdiyah Arrosyadi

Abstract:

The tofu industry is a local food industry with the potential to be competitive in the ASEAN Economic Community (AEC). However, many tofu producers think only about how to produce a good-quality product, without considering the environmental impact of the production process. Daily production of 15 kg of tofu generates 652.5 liters of liquid waste. This liquid waste is discharged directly into waterways, even though tofu liquid waste contains quickly decomposing organic compounds that can pollute them. In addition, tofu liquid waste is high in biological oxygen demand (BOD), chemical oxygen demand (COD), total suspended solids (TSS), nitrogen, and phosphorus. This research aims to create an effective and efficient method of handling the liquid waste using water lettuce. The method combines observation and experiment, applying phytoremediation with water lettuce to the tofu liquid waste and adding aeration to reduce the concentration of contaminants. The results were analyzed against the waste quality standard parameters of the SNI (National Standardization Agency of Indonesia). The average removal efficiencies and parameter changes obtained for the tofu liquid waste were: pH 3.42% (from 4.0 to 3.3), COD 76.13% (from 3579 ppm to 854 ppm), BOD 55% (from 11600 ppm to 5242 ppm), TSS 93.6% (from 3174 ppm to 203 ppm), turbidity 64.8% (from 977 NTU to 1013 NTU), and temperature 36 °C (from 45 °C to 40 °C). The efficiencies of these parameters indicate that the effluent is safe to be discharged into waterways. The water lettuce and tofu liquid waste resulting from phytoremediation will be used for biogas as renewable energy.
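The removal efficiencies quoted above follow the standard formula E = (C_in − C_out) / C_in × 100. A minimal check in Python, using the COD and TSS figures from the abstract, reproduces the reported percentages:

```python
def removal_efficiency(c_in, c_out):
    """Percentage reduction of a contaminant concentration."""
    return 100.0 * (c_in - c_out) / c_in

cod_eff = removal_efficiency(3579, 854)  # COD in ppm -> about 76.1 %
tss_eff = removal_efficiency(3174, 203)  # TSS in ppm -> about 93.6 %
```

The same one-liner applies to BOD and the other parameters; the turbidity pair quoted in the abstract (977 to 1013 NTU) is an increase, so the 64.8% figure cannot be reproduced from those two values as printed.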

Keywords: aeration, phytoremediation, water lettuce, tofu liquid waste

Procedia PDF Downloads 372
13874 Prospective Study to Determine the Efficacy of Day Hospital Care to Improve Treatment Adherence for Hospitalized Schizophrenic Patients

Authors: Jin Hun Choi, So Hyun Ahn, Seong Keun Wang, Ik-Seung Chee, Jung Lan Kim, Sun Woo Lee

Abstract:

Objectives: The purpose of the study is to investigate the effects of day hospital care on hospitalized schizophrenic patients in terms of treatment adherence and treatment outcomes. Methods: Among schizophrenic patients hospitalized between 2011 and 2012, 23 day hospital care patients and 40 control subjects were included in the study. All candidates completed the Beck Cognitive Insight Scale, the Drug Attitude Inventory, the World Health Organization Quality of Life Assessment, and the Psychological Well-Being Scale once their symptoms had stabilized during hospitalization. After discharge, 23 patients received day hospital care for two months and then switched to out-patient care, while 40 patients received out-patient care immediately after discharge. After two months of out-patient care, the treatment adherence of the two groups was evaluated; follow-up observation was performed until February 2013, and survival rates were compared between the two groups. Results: Treatment adherence was higher in the day hospital care group than in the control group. Kaplan-Meier survival analysis showed a higher survival rate for the day hospital care group than for the control group. Levels of cognitive insight and quality of life in the day hospital care group were higher after day hospital care than before. Conclusions: The study confirmed that hospitalized schizophrenic patients who received continuous day hospital care after discharge subsequently adhered more faithfully to out-patient care. The study should aid understanding of schizophrenic patients' treatment adherence issues and the improvement of treatment outcomes.
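The Kaplan-Meier product-limit estimate used to compare the two groups can be written in a few lines. The sketch below runs on made-up follow-up data, not the study's; an event here would be a patient dropping out of treatment, and 0 marks a censored observation:

```python
def kaplan_meier(durations, events):
    """Product-limit survival estimate.

    durations: follow-up time per patient; events: 1 = event observed,
    0 = censored. Returns a list of (time, S(t)) at event times.
    """
    s = 1.0
    curve = []
    for t in sorted(set(durations)):
        at_risk = sum(1 for d in durations if d >= t)
        deaths = sum(1 for d, e in zip(durations, events) if d == t and e)
        if deaths:
            s *= 1.0 - deaths / at_risk  # survival drops only at event times
            curve.append((t, s))
    return curve

# Hypothetical follow-up in months for three patients; the third is censored.
curve = kaplan_meier([1, 2, 3], [1, 1, 0])
```

Computing one curve per group and comparing them (e.g., with a log-rank test) mirrors the analysis the abstract reports.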

Keywords: schizophrenia, day hospital care, adherence, outcomes

Procedia PDF Downloads 342
13873 E-government Status and Impact on Development in the Arab Region

Authors: Sukaina Al-Nasrawi, Maysoun Ibrahim

Abstract:

Information and communication technologies (ICT) have transformed public administration and governance. Electronic government (e-government) services were developed to simplify government procedures and improve interaction with citizens on the one hand, and to create new governance models that empower citizens and involve them in the decision-making process while increasing transparency on the other. It is worth noting that efficient governance models enable sustainable development at the social and economic levels. Currently, the status of e-government national strategies and implementation programs varies from one country to another. This variance in the development levels of e-government initiatives and applications reflects the digital divide between countries of the same region, thereby highlighting the difficulty of reaching regional integration. Many Arab countries have realized the need for a well-articulated e-government strategy and launched national e-government initiatives. In selected Arab countries, the focus of e-government initiatives and programs has shifted from the provision of services to advanced concepts such as open data initiatives. This paper aims to review the e-government achievements of Arab countries and areas for enhancement, and to share best practices from leading e-government programmes in the Arab region and the world. It will also shed light on the impact of the information society in general, and e-government in particular, on social and economic development in the Arab region.

Keywords: Information and Communication Technologies (ICT), services, e-government, development, Arab region, digital divide, citizens

Procedia PDF Downloads 275
13872 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets

Authors: Ece Cigdem Mutlu, Burak Alakent

Abstract:

Maintaining the quality of manufactured products at a desired level depends on the stability of the process dispersion and location parameters and on detecting perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor product quality and control the process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of process dispersion and location, respectively, under the assumption of independent and normally distributed data. On the other hand, there is no guarantee that real-world data will be normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of the traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g., occasional outliers in the rational subgroups of the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For more efficient application of control charts, estimators robust against the contaminations that may exist in Phase I are required. In the current study, we present a simple approach to constructing robust Xbar control charts using the average distance to the median, the Qn estimator of scale, and the M-estimator of scale with logistic psi-function for estimating the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator, and the M-estimator of location with Huber and logistic psi-functions for estimating the process location parameter.
The Phase I efficiency of the proposed estimators and the Phase II performance of the Xbar charts constructed from them are compared with the conventional mean and standard deviation statistics, both under normality and against diffuse-localized and symmetric-asymmetric contaminations, using 50,000 Monte Carlo simulations in MATLAB. It is found that the robust estimators yield parameter estimates with higher efficiency against all types of contamination, and that Xbar charts constructed using robust estimators have higher power in detecting disturbances than conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups, and employing different combinations of dispersion and location estimators on subgroups and individual observations, are found to improve the performance of Xbar charts.
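Two of the estimators named above are short enough to sketch directly: the Hodges-Lehmann location estimator (median of all pairwise Walsh averages) and the average distance to the median for scale. The consistency constant 1.2533 (≈ √(π/2)) below makes the latter an unbiased estimate of σ under normality; the control-limit formula is the usual ±kσ/√n, with k and n as assumptions of this sketch:

```python
from statistics import median

def hodges_lehmann(x):
    """Robust location: median of all pairwise (Walsh) averages."""
    walsh = [(x[i] + x[j]) / 2 for i in range(len(x)) for j in range(i, len(x))]
    return median(walsh)

def adm_sigma(x):
    """Robust scale: average distance to the median, rescaled for normality."""
    m = median(x)
    return 1.2533 * sum(abs(v - m) for v in x) / len(x)

def xbar_limits(center, sigma, n, k=3.0):
    """Lower and upper control limits for subgroup means of size n."""
    half = k * sigma / n ** 0.5
    return center - half, center + half
```

Replacing the Phase I grand mean and pooled standard deviation with these estimators leaves the charting procedure itself unchanged, which is why the comparison in the abstract reduces to estimator efficiency and chart power.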

Keywords: average run length, M-estimators, quality control, robust estimators

Procedia PDF Downloads 177
13871 Modelling of Passengers Exchange between Trains and Platforms

Authors: Guillaume Craveur

Abstract:

The evaluation of the passenger exchange time is necessary for railway operators in order to optimize and dimension rail traffic. Several influential parameters are identified and studied, and each is modelled with the buildingEXODUS software. The objective is to model the passenger exchanges measured by passenger counting. Population size is dimensioned using passenger counting files, which report on the train service and contain the following useful information: the number of passengers who board and leave the train, and the exchange time. This information is collected by sensors placed at the top of each train door. With passenger counting files it is possible to know how many people are involved in an exchange and how long it lasts, but not the passenger flow through each individual door; all the information about the observed exchanges is therefore not available. For this reason, and in order to minimize inaccuracies, only short exchanges (less than 30 seconds) involving a maximum number of people are used.
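A counting record as described (boardings, alightings, and exchange time per stop) can be filtered for exactly the short, busy exchanges the authors retain. The records and field names below are invented for illustration:

```python
# Hypothetical door-sensor records: passengers boarding/alighting
# and the measured exchange time at each stop.
records = [
    {"board": 12, "alight": 8, "time_s": 25},
    {"board": 30, "alight": 22, "time_s": 48},
    {"board": 5, "alight": 3, "time_s": 12},
]

def exchanged(rec):
    """Total number of passengers involved in the exchange."""
    return rec["board"] + rec["alight"]

# Keep only short exchanges (< 30 s), as in the study, ranked by the
# number of passengers involved so the busiest ones come first.
short = sorted((r for r in records if r["time_s"] < 30),
               key=exchanged, reverse=True)
```

Note that only the aggregate flow (total passengers / exchange time) is recoverable from such files; the per-door flow the authors mention as unavailable cannot be computed from them.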

Keywords: passengers exchange, numerical tools, rolling stock, platforms

Procedia PDF Downloads 214
13870 Understanding the Origins of Pesticides Metabolites in Natural Waters through the Land Use, Hydroclimatic Conditions and Water Quality

Authors: Alexis Grandcoin, Stephanie Piel, Estelle Baures

Abstract:

Brittany (France) is an agricultural region where emerging pollutants are at high risk of reaching water bodies. Among them, pesticide metabolites are frequently detected in surface waters. The Vilaine watershed (11,000 km²) is of particular interest, as a large drinking water treatment plant (100,000 m³/day) is located at its extreme downstream end. This study aims to provide an evaluation of pesticide metabolite pollution in the Vilaine watershed and an understanding of metabolite availability, in order to protect the water resource. The hydroclimatic conditions, land use, and water quality parameters controlling metabolite availability are emphasized. This knowledge will later be used to understand the conditions favoring metabolite export towards surface water. Nineteen sampling points were strategically chosen along the 220 km of the Vilaine river and its three main tributaries. Furthermore, the intakes of two drinking water plants were sampled: one located at the extreme downstream end of the Vilaine river, the other drawing on the riparian groundwater beneath the river. Five sampling campaigns were carried out under various hydroclimatic conditions, and water quality parameters and hydroclimatic conditions were measured. Fifteen environmentally relevant pesticides and metabolites were analyzed; these compounds were selected because they are recalcitrant to conventional water treatment. An evaluation of the watershed contamination was done in 2016-2017. First observations showed that aminomethylphosphonic acid (AMPA) and metolachlor ethanesulfonic acid (MESA) are the most frequently detected compounds in surface water samples, with detection frequencies of 100% and 98%, respectively. They are the main pollutants of the watershed regardless of the hydroclimatic conditions. The AMPA concentration in the river increases strongly downstream of the Rennes agglomeration (220,000 inhabitants) and reaches a maximum of 2.3 µg/L in low-water conditions.
Groundwater contains mainly MESA, diuron, and metazachlor ESA at concentrations close to the limits of quantification (LOQ) (0.02 µg/L). Metolachlor, metazachlor, and alachlor, owing to their fast degradation in soils, were found in small amounts (LOQ – 0.2 µg/L). Conversely, glyphosate was regularly found during warm and sunny periods, at up to 0.6 µg/L. Land use (agricultural crop types, urban areas, forests, wastewater treatment plant locations), water quality parameters, and hydroclimatic conditions have been correlated with pesticide and metabolite concentrations in the waters. Statistical analyses showed that chloroacetamide metabolites and AMPA behave differently regardless of the hydroclimatic conditions. Chloroacetamides are correlated with each other, with agricultural areas, and with typical agricultural tracers such as nitrates. They are present in the waters all year round, especially during rainy periods, suggesting important stocks in soils. Chloroacetamides are also negatively correlated with AMPA, the different forms of phosphorus, and organic matter. AMPA is ubiquitous but strongly correlated with urban areas, despite recent French regulation restricting glyphosate to agricultural and private uses. This work helps to predict and understand the metabolites present in the water resource used to produce drinking water. As the studied metabolites are difficult to remove, this project will be completed by a water treatment component.
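The correlations between metabolite concentrations and land-use fractions reported above reduce to a Pearson coefficient per pair of series. A self-contained sketch, with invented numbers standing in for the Vilaine measurements:

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented illustration: MESA concentration (ug/L) against the fraction
# of agricultural land upstream of each sampling point.
agri_fraction = [0.2, 0.4, 0.6, 0.8]
mesa_ug_l = [0.05, 0.11, 0.14, 0.21]
r = pearson(agri_fraction, mesa_ug_l)
```

A positive r for agricultural fractions and a negative r against, say, urban fraction would reproduce the contrast between chloroacetamide metabolites and AMPA described in the abstract.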

Keywords: agricultural watershed, AMPA, metolachlor-ESA, water resource

Procedia PDF Downloads 147
13869 A Techno-Economic Simulation Model to Reveal the Relevance of Construction Process Impact Factors for External Thermal Insulation Composite System (ETICS)

Authors: Virgo Sulakatko

Abstract:

The reduction of energy consumption in the built environment has been one of the topics tackled by the European Commission over the last decade. Increased energy efficiency requirements have increased the renovation rate of apartment buildings covered with External Thermal Insulation Composite Systems (ETICS). Because of the fast and optimized application process, quality assurance depends to a large extent on the specific activities of the artisans and is often not controlled. The on-site degradation factors (DF) have a technical influence on the façade and cause future costs to the owner. Besides thermal conductivity, the building envelope needs to ensure mechanical resistance and stability; fire, noise, corrosion, and weather protection; and long-term durability. As the shortcomings of the construction phase become problematic only after some years, the overall value of the renovation is reduced. Previous work on the subject has identified and rated the relevance of DFs to the technical requirements and developed a method to reveal the economic value of repair works. These future costs can be traded off against increased quality assurance during the construction process. The proposed framework describes the joint simulation of the technical importance and economic value of the on-site DFs of ETICS. The model provides new knowledge for improving resource allocation during the construction process by making it possible to identify and diminish the most relevant degradation factors, increasing the economic value to the owner.
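The trade-off the framework quantifies, a future repair cost against extra quality assurance today, rests on discounting. A minimal life-cycle-costing sketch with invented figures (cost, rate, and horizon are all assumptions, not values from the paper):

```python
def present_value(cost, rate, years):
    """Discount a future repair cost back to today at a given annual rate."""
    return cost / (1 + rate) ** years

# Invented example: a 1000-EUR facade repair expected in 10 years,
# discounted at 4% per year. The resulting present value is roughly the
# amount that could instead justify extra quality assurance on site.
pv = present_value(1000.0, 0.04, 10)
```

Summing such present values over every rated degradation factor gives the economic weight that, in the framework, is set against the technical relevance rating of each factor.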

Keywords: ETICS, construction technology, construction management, life cycle costing

Procedia PDF Downloads 410
13868 The Effects of Subsidised Irrigation Service Fees on Irrigation Performance in Vietnam

Authors: Trang Pham

Abstract:

Approximately 70% of the Vietnamese population lives in rural areas where the main livelihood is farming. For many years, the Vietnamese Government has been working towards improving farmers' quality of life. In 2008, the Government issued Decree 115/2008/ND-CP to subsidize farmers' water fees. The subsidy covers the operation and management costs of major water infrastructure; water users have to pay only for the operation and management of minor or tertiary canal systems. But the 'subsidized water fee' has become contentious, with two opposing schools of thought. One view is that the subsidy lessens the burden on farmers by reducing their production costs, while at the same time generating a sufficient budget for Irrigation Management Companies (IMCs) and Water User Associations (WUAs). The alternative view is that the subsidy negatively affects irrigation performance, especially in tertiary canals. The aim of this study was to gain a clear understanding of the perceptions of farmers, WUA members, and IMC staff regarding irrigation performance and management since the introduction of the subsidies on local water fees. In order to find out how the government intervention has affected local farming communities, a series of questionnaires and interviews was administered in 2013. Four case studies were chosen, representing four different agricultural areas and four different irrigation systems in Vietnam. Interviews were conducted with IMC staff and WUA members, and questionnaires were used to gather information from farmers. The study compares the difference in operation and management costs across the four case studies both before and after the implementation of the decree. The results disclose the factors behind the subsidized water fee that either allow or hinder improved irrigation performance and better irrigation management.

Keywords: water fee, irrigation performance, local farming, tertiary canal systems

Procedia PDF Downloads 306
13867 Effects of the Tomato Pomace Oil Extract on Physical and Antioxidant Properties of Gelatin Films

Authors: N. Jirukkakul, J. Sodtipinta

Abstract:

Tomatoes are widely consumed both fresh and as processed products. Tomato pomace is generated as a manufacturing by-product, accounting for about 5-13% of the whole tomato. Antioxidants remain in tomato pomace, so the extraction of tomato oil may be useful in edible film production. The edible film solution was prepared by mixing gelatin (2, 4, and 6%) with distilled water and heating at 40 °C for 30 min. The effect of tomato pomace oil was evaluated at 0, 0.5, and 1%. The film solution was poured into plates and dried overnight at 40 °C before determining the physical properties: tensile strength, moisture content, color, solubility, and swelling power. The results showed that increasing the gelatin concentration increased tensile strength, moisture content, solubility, and swelling power. The edible film with tomato pomace oil extract appeared as a rough film with dispersed oil droplets. The addition of tomato pomace oil extract increased lightness, redness, and yellowness, while tensile strength, moisture content, and solubility decreased. Films with tomato pomace oil extract at 0.5 and 1% exhibited antioxidant properties, but those properties were not significantly different (p > 0.05) between the two concentrations. The most suitable condition for film production in this study, 4% gelatin and 0.5% tomato pomace oil extract, was selected for protecting palm oil against oxidation. After 15 days of storage, the palm oil covered by the gelatin film with tomato pomace oil extract had a peroxide value (PV) of 22.45 milliequivalents/kg, while the palm oil covered by polypropylene film and the control had 24.79 and 26.67 milliequivalents/kg, respectively. Therefore, the incorporation of tomato pomace oil extract in gelatin film was able to retard the oxidation of food products with high fat content.

Keywords: antioxidant, gelatin films, physical properties, tomato oil extract

Procedia PDF Downloads 265
13866 Core Stability Index for Healthy Young Sri Lankan Population

Authors: V. M. B. K. T. Malwanage, S. Samita

Abstract:

Core stability is one of the major determinants that contribute to preventing injuries, enhancing performance, and improving quality of life. The endurance of the four major muscle groups of the central ‘core’ of the human body is identified as the most reliable determinant of core stability amongst the numerous factors that contribute to it. This study aimed to develop a ‘Core Stability Index’ that confers a single value for an individual’s core stability based on the four endurance test scores. Since it is possible that at least some of the test scores are not independent, the possibility of constructing a single index using the multivariate method of exploratory factor analysis was investigated. The study sample consisted of 400 healthy young individuals with a mean age of 23.74 ± 1.51 years and a mean BMI (Body Mass Index) of 21.1 ± 4.18. The correlation analysis revealed highly significant (P < 0.0001) correlations between test scores, and thus constructing an index from these highly interrelated test scores using factor analysis was justified. The mean values of all test scores were significantly different between males and females (P < 0.0001), and therefore two separate core stability indices were constructed for the two gender groups. Moreover, eigenvalues of 3.103 and 2.305 for males and females, respectively, indicated that a single factor underlies all four test scores, and thus a single-factor-based index was constructed. The 95% reference intervals constructed using the index scores were -1.64 to 2.00 and -1.56 to 2.29 for males and females, respectively. These intervals can effectively be used to diagnose those who need improvement in core stability. Practitioners should find that, with a single value measure, they can be more consistent among themselves.
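A single-factor index of the kind described can be sketched as follows. The synthetic endurance scores, the noise levels, and the percentile-based reference interval are illustrative assumptions, not the study's data or its exact factor-analysis procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical endurance test scores for four core muscle groups,
# one row per participant; columns are correlated, as in the study.
latent = rng.normal(0.0, 1.0, (400, 1))
scores = latent + rng.normal(0.0, 0.6, (400, 4))

# Standardize, then take the first principal component as a single-factor index.
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(z, rowvar=False))
loadings = eigvecs[:, -1]                      # eigenvector of the largest eigenvalue
loadings = loadings if loadings.sum() > 0 else -loadings  # fix arbitrary sign
index = z @ loadings                           # factor-based core stability index

# A 95% reference interval from the empirical percentiles of the index.
lo, hi = np.percentile(index, [2.5, 97.5])
```

Because the index is a linear combination of standardized scores, it is centered at zero, so the reference interval brackets zero as in the reported intervals.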

Keywords: construction of indices, endurance test scores, muscle endurance, quality of life

Procedia PDF Downloads 149
13865 Change Point Analysis in Average Ozone Layer Temperature Using Exponential Lomax Distribution

Authors: Amjad Abdullah, Amjad Yahya, Bushra Aljohani, Amani Alghamdi

Abstract:

Change point detection is an important part of data analysis. The presence of a change point indicates a significant change in the behavior of a time series. In this article, we examine the detection of multiple change points in the parameters of the exponential Lomax distribution, which is broad and flexible compared with other distributions when fitting data. We used the Schwarz information criterion and binary segmentation to detect multiple change points in publicly available data on the average temperature in the ozone layer. The change points were successfully located.
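Binary segmentation with a Schwarz-type penalty can be sketched for the simpler Gaussian change model below; in the article's setting the Gaussian cost would be replaced by the exponential Lomax log-likelihood, and the penalty value here is an illustrative assumption:

```python
import numpy as np

def sic(x):
    """Schwarz-type cost of modeling segment x with a single Gaussian (up to constants)."""
    n = len(x)
    var = x.var()
    if n < 2 or var == 0:
        return np.inf
    return n * np.log(var) + np.log(n)

def binary_segmentation(x, offset=0, min_len=5, penalty=5.0):
    """Recursively split wherever a candidate change point lowers the total cost."""
    n = len(x)
    whole = sic(x)
    best_gain, best_k = 0.0, None
    for k in range(min_len, n - min_len):
        gain = whole - (sic(x[:k]) + sic(x[k:]) + penalty)
        if gain > best_gain:
            best_gain, best_k = gain, k
    if best_k is None:
        return []
    return (binary_segmentation(x[:best_k], offset, min_len, penalty)
            + [offset + best_k]
            + binary_segmentation(x[best_k:], offset + best_k, min_len, penalty))

rng = np.random.default_rng(1)
series = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(4.0, 1.0, 60)])
change_points = binary_segmentation(series)   # should locate a split near index 60
```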

Keywords: binary segmentation, change point, exponential Lomax distribution, information criterion

Procedia PDF Downloads 163
13864 Consortium Blockchain-based Model for Data Management Applications in the Healthcare Sector

Authors: Teo Hao Jing, Shane Ho Ken Wae, Lee Jin Yu, Burra Venkata Durga Kumar

Abstract:

Current distributed healthcare systems face the challenge of interoperability of health data. Storing electronic health records (EHR) in local databases causes them to be fragmented, a problem that is aggravated as patients visit multiple healthcare providers over their lifetime. Existing solutions are unable to solve this issue and have burdened healthcare specialists and patients alike. Blockchain technology was found to increase the interoperability of health data by implementing digital access rules, enabling a uniform patient identity, and providing data aggregation. Consortium blockchains were found to have high read throughput, to be more trustworthy and more secure against external disruption, and to accommodate transactions without fees. Therefore, this paper proposes a blockchain-based model for data management applications. In this model, a consortium blockchain is implemented using delegated proof of stake (DPoS) as its consensus mechanism. This blockchain allows collaboration between users from different organizations such as hospitals and medical bureaus. Patients serve as the owners of their information, and users from other parties require authorization from the patient to view it. Hospitals upload the hash value of patients’ generated data to the blockchain, whereas the encrypted information is stored in distributed cloud storage.
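The on-chain/off-chain split described above, where only a digest of the record is anchored on the blockchain, can be sketched as follows; the record fields and the choice of SHA-256 over canonical JSON are illustrative assumptions:

```python
import hashlib
import json

def ehr_hash(record: dict) -> str:
    """Digest of a canonical JSON encoding of a patient record.
    Only this digest would go on-chain; the encrypted record itself
    stays in distributed cloud storage."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify(record: dict, on_chain_digest: str) -> bool:
    """Check an off-chain record against the digest stored on-chain."""
    return ehr_hash(record) == on_chain_digest
```

Any tampering with the stored record changes the digest, so a reader authorized by the patient can detect modification without the chain ever holding the health data itself.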

Keywords: blockchain technology, data management applications, healthcare, interoperability, delegated proof of stake

Procedia PDF Downloads 124
13863 Knowledge Management in Agro-Alimentary Companies in Algeria

Authors: Radia Bernaoui, Mohamed Hassoun

Abstract:

Our survey addresses the measurement of knowledge management among actors in the Algerian agricultural sector, through a study carried out with professionals affiliated with agro-alimentary companies (‘agribusinesses’). Taking into account the creation of a national information system on agronomic research in Algeria, the aim is to analyze their informational practices and to assess how they rate the sharing of knowledge and the process of collective intelligence. The results of our study reveal a crucial need: the creation of a framework suitable for the sharing of knowledge, so as to produce ‘socially shared knowledge’ where the scientific community can interact with firms. It is a question of promoting processes for the adaptation and dissemination of knowledge, through a partnership between the R&D sector and the production sector, to increase the competitiveness of firms and support the sustainable development of the country.

Keywords: knowledge management, pole of competitiveness, economy of knowledge, agro-alimentary, agribusiness, information system, Algeria

Procedia PDF Downloads 317
13862 Effect of Laser Ablation OTR Films and High Concentration Carbon Dioxide for Maintaining the Freshness of Strawberry ‘Maehyang’ for Export in Modified Atmosphere Condition

Authors: Hyuk Sung Yoon, In-Lee Choi, Min Jae Jeong, Jun Pill Baek, Ho-Min Kang

Abstract:

This study was conducted to improve the storability of strawberry 'Maehyang' for export by using suitable laser-ablated oxygen transmission rate (OTR) films and assessing the effectiveness of high carbon dioxide. Strawberries were grown hydroponically in Gyeongsangnam-do province. They were packed in laser-ablated OTR films (Daeryung Co., Ltd.) of 1,300 cc, 20,000 cc, 40,000 cc, 80,000 cc, and 100,000 cc·m-2·day·atm. A CO2 injection (30%) treatment used the 20,000 cc·m-2·day·atm OTR film, and a perforated film served as the control. Temperature conditions simulated shipping and distribution from Korea to Singapore: storage at 3 ℃ (13 days), 10 ℃ (an hour), and 8 ℃ (7 days), for 20 days in total. The fresh weight loss rate stayed under 1%, the maximum permissible weight loss, in all treated OTR films except the perforated control during storage. The carbon dioxide concentration within the packages remained below the maximum tolerated CO2 concentration (15%) in the treated OTR films, and in the high-OTR treatments (20,000 cc to 100,000 cc) it was less than 3%. The 1,300 cc film maintained a suitable carbon dioxide range (over 5% and under 15%) from 5 days after storage until the end of the experiment; in the CO2 injection treatment the concentration dropped quickly to 15% after 1 day of storage but then remained around 15% throughout. The oxygen concentration was maintained between 10 and 15% in the 1,300 cc and CO2 injection treatments, but stayed at 19 to 21% in the other treatments. The ethylene concentration was much higher in the CO2 injection treatment than in the OTR treatments; among the OTR treatments, 1,300 cc showed the highest ethylene concentration and 20,000 cc the lowest. Firmness was maintained best in the 1,300 cc treatment, with no significant differences among the other OTR treatments. Visual quality was best in the 20,000 cc treatment, which retained marketable quality until 20 days after storage. The 20,000 cc and perforated films performed better than the other treatments in terms of off-odor, whereas the 1,300 cc and CO2 injection treatments developed a strong off-odor that persisted even after 10 minutes. Based on the difference between the Hunter 'L' and 'a' values from the chroma meter, color development was delayed in the 1,300 cc and CO2 injection treatments, while the other treatments showed no significant differences. The results indicate that freshness was best maintained at 20,000 cc·m-2·day·atm. Although the 1,300 cc and CO2 injection treatments produced an appropriate MA condition, they showed darkening of the strawberry calyx and excessive reduction of coloring due to the high carbon dioxide concentration during storage. While the 1,300 cc and CO2 injection treatments had been considered appropriate for export to Singapore, the results proved otherwise. These results reflect the cultivar characteristics of strawberry 'Maehyang'.

Keywords: carbon dioxide, firmness, shelf-life, visual quality

Procedia PDF Downloads 386
13861 Effective Dose and Size Specific Dose Estimation with and without Tube Current Modulation for Thoracic Computed Tomography Examinations: A Phantom Study

Authors: S. Gharbi, S. Labidi, M. Mars, M. Chelli, F. Ladeb

Abstract:

The purpose of this study is to reduce the radiation dose for chest CT examinations by adding Tube Current Modulation (TCM) to a standard CT protocol. A scan of an anthropomorphic male Alderson phantom was performed on a 128-slice scanner. The effective dose (ED) in both scans, with and without mAs modulation, was estimated by multiplying the Dose Length Product (DLP) by a conversion factor. Results were compared with those obtained with the CT-Expo software. The size-specific dose estimate (SSDE) values were obtained by multiplying the volume CT dose index (CTDIvol) by a size conversion factor related to the phantom’s effective diameter. Objective assessment of image quality was performed with Signal-to-Noise Ratio (SNR) measurements in the phantom. SPSS software was used for data analysis. Results showed that with CARE Dose 4D included, ED was lowered by 48.35% and 51.51% according to the DLP and CT-Expo methods, respectively. In addition, ED ranged between 7.01 mSv and 6.6 mSv for the standard protocol, and between 3.62 mSv and 3.2 mSv with TCM. Similar results were found for SSDE: the dose was 16.25 mGy without TCM and 48.8% lower with TCM. The SNR values were significantly different (p=0.03<0.05); the highest was measured on images acquired with TCM and reconstructed with filtered back projection (FBP). In conclusion, this study demonstrates the potential of the TCM technique to reduce SSDE and ED while preserving image quality at a high diagnostic reference level for thoracic CT examinations.
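Both dose estimates described above are single multiplications. The sketch below uses the commonly tabulated chest conversion factor k ≈ 0.014 mSv·mGy⁻¹·cm⁻¹ and a size conversion factor of the kind looked up from the phantom's effective diameter (e.g. in AAPM Report 204 tables); the numeric inputs are illustrative, not the study's calibration:

```python
def effective_dose(dlp_mgy_cm: float, k: float = 0.014) -> float:
    """ED = DLP x k, with k the region-specific conversion factor (mSv per mGy*cm)."""
    return dlp_mgy_cm * k

def ssde(ctdivol_mgy: float, size_factor: float) -> float:
    """SSDE = CTDIvol x f, with f read from a size-conversion table
    for the patient's (or phantom's) effective diameter."""
    return ctdivol_mgy * size_factor

ed = effective_dose(500.0)   # 7.0 mSv, comparable to the standard-protocol range
dose = ssde(12.5, 1.3)       # 16.25 mGy, matching the reported no-TCM SSDE
```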

Keywords: anthropomorphic phantom, computed tomography, CT-expo, radiation dose

Procedia PDF Downloads 206
13860 Developing an Instrument to Measure Teachers’ Self-Efficacy of Teaching Innovation Skills

Authors: Huda S. Al-Azmi

Abstract:

There is a growing consensus that the adoption of teachers’ self-efficacy measurement tools helps to assess teachers’ abilities in specific areas in order to improve their skills. As a result, different instruments to assess teachers’ abilities have been developed by academics and practitioners. However, many of these instruments either focus on general teaching skills or, at the other extreme, are very specific to one subject. As such, they do not offer a tool to measure teachers’ ability to teach 21st-century skills such as innovation skills. Teaching innovation skills helps to prepare students for lives and careers in the 21st century. The purpose of this study is to develop an instrument measuring teachers’ self-efficacy in teaching innovation skills in the classroom context and evaluating teachers’ beliefs regarding their ability to teach those skills. To this end, the 16-item instrument measures four dimensions of innovation skills: creativity, critical thinking, communication, and collaboration. 211 secondary-school teachers filled out the survey so that the quality of the instrument could be analyzed quantitatively. The instrument’s reliability and item analysis were measured using jMetrik. The results showed that mean self-efficacy ranged from 3 to 3.6, without extremely high or low self-efficacy scores. The discrimination analysis revealed that one item had a negative correlation with the total and three items had low correlations with the total. The reliabilities of the items ranged from 0.64 to 0.69, so the instrument needed some revision before practical use. The study concluded that one item should be discarded and five items revised to increase the quality of the instrument for future work.
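The discrimination statistic reported (an item's correlation with the total) is typically computed as a corrected item-total correlation, which can be sketched as follows; the synthetic responses and the deliberately mis-keyed fourth item are illustrative assumptions, not the study's data:

```python
import numpy as np

def corrected_item_total(responses):
    """Item discrimination: correlation of each item with the sum of the
    remaining items (corrected item-total correlation)."""
    rest_totals = responses.sum(axis=1, keepdims=True) - responses
    return np.array([np.corrcoef(responses[:, i], rest_totals[:, i])[0, 1]
                     for i in range(responses.shape[1])])

rng = np.random.default_rng(42)
latent = rng.normal(0.0, 1.0, (211, 1))        # 211 respondents, as in the study
good = latent + rng.normal(0.0, 0.7, (211, 3)) # three items keyed correctly
bad = -latent + rng.normal(0.0, 0.7, (211, 1)) # one mis-keyed item
discrimination = corrected_item_total(np.hstack([good, bad]))
```

An item with negative discrimination, like the fourth column here, is exactly the kind of item the analysis flags for discarding.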

Keywords: critical thinking, collaboration, innovation skills, self-efficacy

Procedia PDF Downloads 200
13859 Adaptive Energy-Aware Routing (AEAR) for Optimized Performance in Resource-Constrained Wireless Sensor Networks

Authors: Innocent Uzougbo Onwuegbuzie

Abstract:

Wireless Sensor Networks (WSNs) are crucial for numerous applications, yet they face significant challenges due to resource constraints such as limited power and memory. Traditional routing algorithms like Dijkstra, Ad hoc On-Demand Distance Vector (AODV), and Bellman-Ford, while effective in path establishment and discovery, are not optimized for the unique demands of WSNs due to their large memory footprint and power consumption. This paper introduces the Adaptive Energy-Aware Routing (AEAR) model, a solution designed to address these limitations. AEAR integrates reactive route discovery, localized decision-making using geographic information, energy-aware metrics, and dynamic adaptation to provide a robust and efficient routing strategy. We present a detailed comparative analysis using a dataset of 50 sensor nodes, evaluating power consumption, memory footprint, and path cost across AEAR, Dijkstra, AODV, and Bellman-Ford algorithms. Our results demonstrate that AEAR significantly reduces power consumption and memory usage while optimizing path weight. This improvement is achieved through adaptive mechanisms that balance energy efficiency and link quality, ensuring prolonged network lifespan and reliable communication. The AEAR model's superior performance underlines its potential as a viable routing solution for energy-constrained WSN environments, paving the way for more sustainable and resilient sensor network deployments.
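The idea of biasing shortest-path costs by the remaining energy of candidate next hops can be sketched as follows; the graph, the energy values, and the alpha weighting are illustrative assumptions, not the AEAR model itself:

```python
import heapq

def energy_aware_path(graph, energy, src, dst, alpha=0.5):
    """Dijkstra over a combined cost: link weight plus a penalty that grows
    as the next hop's remaining energy shrinks, steering traffic away from
    nearly depleted nodes to prolong network lifespan."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            cost = d + w + alpha / max(energy[v], 1e-9)
            if cost < dist.get(v, float("inf")):
                dist[v] = cost
                prev[v] = u
                heapq.heappush(pq, (cost, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return path[::-1]

graph = {"A": [("B", 1.0), ("C", 0.5)], "B": [("D", 1.0)], "C": [("D", 0.5)]}
energy = {"A": 1.0, "B": 1.0, "C": 0.05, "D": 1.0}
route = energy_aware_path(graph, energy, "A", "D")  # avoids depleted node C
```

Even though the route through C has lower hop weights, the energy penalty makes the path through the healthy node B cheaper overall, which is the balance between path cost and energy efficiency the abstract describes.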

Keywords: wireless sensor networks (WSNs), adaptive energy-aware routing (AEAR), routing algorithms, energy efficiency, network lifespan

Procedia PDF Downloads 13
13858 Blueprinting of a Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems

Authors: Bassam Istanbouli

Abstract:

With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good-quality products at competitive prices, when and how the customer wants them. In order to achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise, changes will be extremely expensive and slow and will have many combinatorial effects. Those combinatorial effects impact the whole organizational structure from management, financial, documentation, and logistics perspectives, and especially from the perspective of the Enterprise Resource Planning (ERP) information system. By applying the normalized systems concept/theory to segments of the supply chain, we believe these effects can be minimized, especially at the time of launching an organization-wide global software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement existing ERP software for its business needs, and if its business processes are normalized and modular, then most probably this will yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of business processes in a software implementation project: if the blueprints created are normalized, then the software developers and configurators can map those modular blueprints into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring, and/or implementing a software system for an organization using two methods: the Software Development Lifecycle (SDLC) method and the Accelerated SAP (ASAP) implementation method. Both methods start with the customer requirements, then blueprint the business processes, and finally map those processes into a software system. Since those requirements and processes are the starting point of the implementation, normalizing those processes will result in normalized software.

Keywords: blueprint, ERP, modular, normalized

Procedia PDF Downloads 125
13857 Pregnant Women and Mothers in Prison, Mother and Baby Units and Mental Health

Authors: Rachel Dolan

Abstract:

Background: Over two-thirds of women in prison in England are mothers, and estimates suggest that between 100 and 200 women per year give birth during imprisonment. There are currently six mother and baby units (MBUs) in prisons in England, which admit women and babies up to the age of 18 months. Although there are only 65 places available, and despite their positive impacts, they are rarely full. Mental illness may influence the number of admissions, as may the interpretation of admission criteria. They are the only current alternative to separation for imprisoned mothers and their babies. Aims: To identify the factors that affect the decision to apply for or be offered a place in a prison MBU; to measure the impact of a placement upon maternal mental health and wellbeing; to measure the initial outcomes for mother and child. Methods: A mixed-methods approach: 100 pregnant women are currently being recruited from prisons in England. Quantitative measures will establish the prevalence of mental disorder, personality disorder, substance misuse, and quality of life. Qualitative interviews will document the experiences of pregnancy and motherhood in prison. Results: Preliminary quantitative findings suggest that the most prevalent mental disorders are anxiety and depression, and approximately half the participants meet the criteria for one or more personality disorders. The majority of participants to date have been offered a place in a prison MBU, and those held in a prison with an MBU prior to applying are more likely to be admitted. Those with a previous history of childcare issues who are known to social services are less likely to be offered a place. Qualitative findings suggest that many women are often hungry and uncomfortable during pregnancy, many have feelings of guilt about having a child in prison, and feelings of anxiety and worry are exacerbated by a lack of information.

Keywords: mothers, prison, mother and baby units, mental health

Procedia PDF Downloads 269
13856 The Use of TRIZ to Map the Evolutive Pattern of Products

Authors: Fernando C. Labouriau, Ricardo M. Naveiro

Abstract:

This paper presents a model for mapping the evolutive pattern of products in order to generate new ideas, perceive emerging technologies, and manage product portfolios in new product development (NPD). According to the proposed model, information extracted from the patent system is filtered and analyzed with TRIZ tools to produce the input information for the NPD process. The authors acknowledge that the NPD process is well integrated with enterprises’ strategic business planning and that new products are vital in today’s competitive market. On the other hand, the proactive use of patent information has been observed in several methodologies for selecting projects, mapping technological change, and generating product concepts. One of these methodologies is TRIZ, a theory created to foster innovation and improve product design, which provided the analytical framework for the model. Initially, an introduction to TRIZ is presented, focused mainly on the patterns of evolution of technical systems and their strategic uses; the description is brief and deliberately non-comprehensive, as the theory has several other tools widely employed in technical and business applications. Then, the model for mapping the product evolutive pattern is introduced with its three basic pillars, namely patent information, TRIZ, and NPD, along with the methodology for implementation. Following that, a case study of a Brazilian bicycle manufacturer is presented, in which the evolutive pattern of a product is mapped by decomposing and analyzing one of its assemblies along ten evolution lines in order to envision opportunities for further product development. Some of these lines are illustrated in more detail to evaluate the features of the product in relation to the TRIZ concepts, using a comparison with state-of-the-art patents to validate the product’s evolutionary potential. As a result, the case study provided several opportunities for a product improvement development program in different project categories, identifying technical and business impacts as well as indicating the lines of evolution that can most benefit from each opportunity.

Keywords: product development, patents, product strategy, systems evolution

Procedia PDF Downloads 484
13855 Analysis of the Level of Production Failures by Implementing New Assembly Line

Authors: Joanna Kochanska, Dagmara Gornicka, Anna Burduk

Abstract:

The article examines the process of implementing a new assembly line in a manufacturing enterprise in the household appliances industry. At the initial stages of the project, a decision was made that one of its foundations should be the concept of lean management. Because of that, eliminating as many errors as possible in the first phases of its operation was emphasized. During the start-up of the line, all production losses were identified and documented (from serious machine failures, through any unplanned downtime, to micro-stops and quality defects). During the 6-week line start-up period, all errors resulting from problems in various areas were analyzed. These areas included, among others, production, logistics, quality, and organization. The aim of the work was to analyze the occurrence of production failures during the initial phase of starting up the line and to propose a method for determining their critical level during full operation. The repeatability of production losses in various areas and at different levels at this early stage of implementation was examined using the methods of statistical process control. Based on a Pareto analysis, the weakest points were identified in order to focus improvement actions on them. The next step was to examine the effectiveness of the actions undertaken to reduce the level of recorded losses. Based on the obtained results, a method was proposed for determining the critical failure level in the studied areas. The developed coefficient can be used as an alarm in case of production imbalance caused by an increased failure level in production and production-support processes during the standardized functioning of the line.
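The Pareto step for isolating the "vital few" loss areas can be sketched as follows; the failure counts per area are hypothetical, not the study's data:

```python
from collections import Counter

# Hypothetical start-up loss log: recorded failures per area.
losses = Counter({"production": 48, "logistics": 21, "quality": 17,
                  "organization": 9, "other": 5})

total = sum(losses.values())
cumulative, vital_few = 0, []
for area, count in losses.most_common():      # areas sorted by failure count
    cumulative += count
    vital_few.append(area)
    if cumulative / total >= 0.8:             # classic 80% Pareto cut-off
        break
```

With these counts, the first three areas already account for over 80% of losses, so improvement actions would be focused there first.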

Keywords: production failures, level of production losses, new production line implementation, assembly line, statistical process control

Procedia PDF Downloads 116
13854 Asymmetrical Informative Estimation for Macroeconomic Model: Special Case in the Tourism Sector of Thailand

Authors: Chukiat Chaiboonsri, Satawat Wannapan

Abstract:

This paper applied an asymmetric information concept to the estimation of a macroeconomic model of the tourism sector in Thailand. The variables analyzed statistically are Thailand’s international and domestic tourism revenues, the expenditures of foreign and domestic tourists, service investments by the private sector, service investments by the government of Thailand, Thailand’s service imports and exports, and net service income transfers. All data are time-series indices observed between 2002 and 2015. Empirically, the tourism multiplier and accelerator were estimated by two statistical approaches. The first was the Generalized Method of Moments (GMM) model, based on the assumption that the tourism market in Thailand had perfect information (symmetrical data). The second was the Maximum Entropy Bootstrap approach (MEboot), based on a process that attempts to deal with imperfect information and reduce uncertainty in data observations (asymmetrical data). In addition, tourism leakages were investigated with a simple model based on the injections-and-leakages concept. The empirical findings show that the parameters computed from the MEboot approach differ from those of the GMM method. However, both the MEboot estimation and the GMM model suggest that Thailand’s tourism sectors are in a period capable of stimulating the economy.

Keywords: Thailand tourism, maximum entropy bootstrap approach, macroeconomic model, asymmetric information

Procedia PDF Downloads 281
13853 The Effect of Information vs. Reasoning Gap Tasks on the Frequency of Conversational Strategies and Accuracy in Speaking among Iranian Intermediate EFL Learners

Authors: Hooriya Sadr Dadras, Shiva Seyed Erfani

Abstract:

Speaking skills merit meticulous attention from both learners and teachers. In particular, accuracy is a critical component in guaranteeing that messages are conveyed correctly through conversation, because an erroneous change may adversely alter the content and purpose of the talk. Different types of tasks have served teachers in meeting numerous educational objectives. Besides, negotiation of meaning and the use of different strategies have been areas of concern in socio-cultural theories of SLA. Negotiation of meaning is among the conversational processes that have a crucial role in facilitating the understanding and expression of meaning in a given second language. Conversational strategies are used during interaction when there is a breakdown in communication, leading the interlocutor to attempt to remedy the gap through talk. Therefore, this study investigated whether there was any significant difference between the effect of reasoning gap tasks and information gap tasks on the frequency of conversational strategies used in the negotiation of meaning in classrooms, on one hand, and on the speaking accuracy of Iranian intermediate EFL learners, on the other. After a pilot study to check the practicality of the treatments, at the outset of the main study, the Preliminary English Test (PET) was administered to ensure the homogeneity of 87 out of 107 participants, who attended the intact classes of a 15-session term in one control and two experimental groups. The speaking sections of the PET were used as pretest and posttest to examine speaking accuracy. The tests were recorded and transcribed, and speaking accuracy was measured as the percentage of clauses with no grammatical errors out of the total clauses produced. In all groups, the grammatical points of accuracy were taught and the use of conversational strategies was practiced. Then, different kinds of reasoning gap tasks (matchmaking, deciding on a course of action, and working out a timetable) and information gap tasks (restoring an incomplete chart, spotting the differences, arranging sentences into stories, and a guessing game) were used in the experimental groups during the treatment sessions, and the students were required to practice conversational strategies when doing the speaking tasks. The conversations throughout the term were recorded and transcribed in order to count the frequency of the conversational strategies used in all groups. The results of the statistical analysis demonstrated that applying both the reasoning gap tasks and the information gap tasks significantly affected the frequency of conversational strategies used in negotiation. Of the two, the reasoning gap tasks had a greater impact on encouraging the negotiation of meaning and increasing the frequency of conversational strategies in each session. The findings also indicated that both task types helped learners significantly improve their speaking accuracy, with the reasoning gap tasks again more effective than the information gap tasks.
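The accuracy measure used in the study, the percentage of error-free clauses among all produced clauses, amounts to a one-line computation; the clause counts below are illustrative:

```python
def speaking_accuracy(error_free_clauses: int, total_clauses: int) -> float:
    """Speaking accuracy as the percentage of clauses with no grammatical
    errors out of the total clauses produced in a transcribed test."""
    if total_clauses == 0:
        return 0.0
    return 100.0 * error_free_clauses / total_clauses

speaking_accuracy(34, 40)  # 85.0 (i.e., 85% of produced clauses were error-free)
```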

Keywords: accuracy in speaking, conversational strategies, information gap tasks, reasoning gap tasks

Procedia PDF Downloads 296
13852 From Responses of Macroinvertebrate Metrics to the Definition of Reference Thresholds

Authors: Hounyèmè Romuald, Mama Daouda, Argillier Christine

Abstract:

The present study focused on the use of benthic macrofauna to define the reference state of an anthropized lagoon (Nokoué, Benin) from the responses of relevant metrics to pressure proxies. The approach used is a combination of a joint species distribution model and Bayesian networks. The joint species distribution model was used to select the relevant metrics and generate posterior probabilities, which were then converted into posterior response probabilities for each of the quality classes (pressure levels); these constitute the conditional probability tables of the probabilistic graph representing the causal relationships between metrics and pressure proxies. For the definition of the reference thresholds, the predicted responses for low-pressure levels were read from probability density diagrams. Observations collected during high- and low-water periods spanning three consecutive years (2004-2006), covering 33 macroinvertebrate taxa present at all seasons and sampling points, together with measurements of 14 environmental parameters, were used as application data. The study demonstrated reliable inferences, the selection of seven relevant metrics, and the definition of quality thresholds for each environmental parameter. The relevance of the metrics, as well as of the reference thresholds for ecological assessment despite the small sample size, suggests the potential for wider applicability of the approach in aquatic ecosystem monitoring and assessment programs in developing countries, which are generally characterized by a lack of monitoring data.

Keywords: pressure proxies, Bayesian inference, bioindicators, acadjas, functional traits

Procedia PDF Downloads 70