Search results for: Minimum data set
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26489

26159 Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing

Authors: Rowan P. Martnishn

Abstract:

During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables – historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours’ worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to clean and format all the data. First, a basic spelling and grammar check was applied, as well as a Python script for normalized formatting which used an open-source grammatical formatter to make the data as coherent as possible. Ten documents were randomly selected for manual review, where words frequently mis-transcribed during transcription were recorded and replaced throughout all other documents. Then, to remove all banter and side comments, the transcripts were spliced into paragraphs (separated by change in speaker) and all paragraphs with fewer than 300 characters were removed. Secondly, a keyword extractor, a form of natural language processing in which significant words in a document are selected, was run on each paragraph of every interview. Every proper noun was put into a data structure corresponding to that respective interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was applied to each paragraph that included any of the proper nouns selected from the interview. At this stage the information to review had been reduced from about 60 hours’ worth of data to 20. The data was further processed through light, manual observation – any summaries which proved to fit the criteria of the proposed deliverable were selected, as well as their locations within the document. This narrowed the data down to about 5 hours’ worth of processing. The qualitative researchers were then able to find 8 more connections in addition to our previous 4, exceeding our minimum quota of 3 to satisfy the grant. Major findings of the study and the subsequent curation of this methodology raised a conceptual point crucial to working with qualitative data of this magnitude. In the use of artificial intelligence there is a general trade-off in a model between breadth of knowledge and specificity. If the model has too much knowledge, the user risks leaving out important data (too general). If the tool is too specific, it has not seen enough data to be useful. This methodology proposes a solution to that trade-off. The data is never altered beyond grammatical and spelling checks. Instead, the important information is marked, creating an indicator of where the significant data is without compromising its purity. Secondly, the data is chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords (allowing generalization over the whole document). This way, no data is harmed, and qualitative experts can go over the raw data instead of using highly manipulated results. Given the success in deliverable creation as well as the circumvention of this trade-off, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
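
A minimal sketch of the filtering and summarization steps described above is given below. The "SPEAKER:" turn format, the capitalized-word heuristic for proper nouns, and the Hugging Face "facebook/bart-large-cnn" checkpoint are assumptions standing in for the study's actual tools; the 300-character cutoff comes from the abstract.

```python
# Sketch only: splice a transcript into speaker paragraphs, drop short banter,
# collect rough proper nouns, and summarize the paragraphs that mention them.
# The raw transcript is never altered; only locations and summaries are recorded.
import re
from pathlib import Path
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

def split_paragraphs(text):
    """Splice a transcript into paragraphs at each change of speaker."""
    return [p.strip() for p in re.split(r"\n(?=[A-Z ]+:)", text) if p.strip()]

def proper_nouns(paragraphs):
    """Very rough proper-noun heuristic: capitalized words of three or more letters."""
    nouns = set()
    for p in paragraphs:
        nouns.update(re.findall(r"\b[A-Z][a-z]{2,}\b", p))
    return nouns

def mark_relevant(path):
    text = Path(path).read_text(encoding="utf-8")
    # Drop banter/side comments: paragraphs shorter than 300 characters.
    paras = [p for p in split_paragraphs(text) if len(p) >= 300]
    keywords = proper_nouns(paras)
    marked = []
    for i, p in enumerate(paras):
        if any(k in p for k in keywords):
            # Crude character truncation keeps the input within the model's limit.
            summary = summarizer(p[:3000], max_length=60, min_length=15)[0]["summary_text"]
            marked.append({"paragraph": i, "summary": summary})
    return marked

if __name__ == "__main__":
    for item in mark_relevant("interview_01.txt"):   # hypothetical file name
        print(item["paragraph"], item["summary"])
```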

Keywords: B.A.R.T. model, keyword extractor, natural language processing, qualitative coding

Procedia PDF Downloads 27
26158 Optimal Evaluation of Weather Risk Insurance for Wheat

Authors: Slim Amami

Abstract:

A model is developed to manage the risks related to climate conditions in the agricultural sector. It determines the optimum yearly premium a farmer should pay in order to reach his required turnover. The model is mainly based on both climatic stability and the 'soft' responses of commonly grown species to average climate variations at the same place, within a safety region that can be determined from past meteorological data. This allows a linear regression expression to be used for the dependence of the production result on the driving meteorological parameters, the main ones being daily average sunlight, rainfall and temperature. By a simple best-parameter fit against an expert table drawn up with professionals, an optimal representation of yearly production is deduced from records of previous years, and the yearly payback is evaluated from the minimum yearly turnover to be produced. The optimal premium is then deduced, giving the producer a useful bound for negotiating an offer from insurance companies to effectively protect the harvest. The application to wheat production in the French Oise department illustrates the reliability of the present model, with as little as 6% difference between predicted and real data. The model can be adapted to almost every agricultural field by changing the state parameters and calibrating their associated coefficients.
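
The sketch below illustrates the regression-then-premium idea under stated assumptions: yearly production is regressed on average sunlight, rainfall and temperature, and a premium bound is derived from a required minimum turnover. All figures (yields, prices, area, loading factor) are synthetic placeholders, not data from the study.

```python
# Illustrative only: linear regression of yield on meteorological drivers,
# then a crude premium bound from a required minimum turnover.
import numpy as np

# Columns: daily average sunlight (h), rainfall (mm), temperature (°C); one row per year.
X = np.array([
    [5.1, 620.0, 11.2],
    [4.8, 580.0, 11.9],
    [5.4, 700.0, 10.8],
    [5.0, 640.0, 11.5],
    [4.6, 560.0, 12.1],
])
yield_t_per_ha = np.array([7.2, 6.8, 7.9, 7.4, 6.5])     # observed wheat yields (synthetic)

# Best-fit linear model: yield ≈ b0 + b1*sunlight + b2*rainfall + b3*temperature
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, yield_t_per_ha, rcond=None)

# Predicted turnover for a forecast year and the payback needed to reach the
# farmer's required minimum turnover (all values hypothetical).
price_per_tonne, area_ha = 210.0, 100.0
forecast = np.array([1.0, 4.9, 600.0, 11.7]) @ coef       # predicted yield, t/ha
turnover = forecast * area_ha * price_per_tonne
required_min_turnover = 150_000.0
payback = max(0.0, required_min_turnover - turnover)

# A simple premium bound: expected payback plus a loading factor.
premium = 1.15 * payback
print(f"predicted yield {forecast:.2f} t/ha, premium bound {premium:.0f}")
```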

Keywords: agriculture, database, meteorological factors, production model, optimal price

Procedia PDF Downloads 221
26157 Phytochemical Screening, Anti-Microbial, and Minerals Determination of Leptadenia Hastata

Authors: I. L. Ibrahim, A. Mann, B. A. Adam

Abstract:

This project involved the phytochemical screening, antibacterial testing, and mineral determination (by flame photometry) of Leptadenia hastata. The phytochemical screening reveals the presence of flavonoids, tannins, saponins, alkaloids, steroids, and anthraquinones, while cardiac glycosides were absent. This justifies the plant being used as an anti-bleeding and anti-inflammatory agent. Flame photometry revealed 1.85% Na, 0.65% K and 1.85% Ca, which indicates the safe nature of the plant extract; as such, it could be used to lower high blood pressure. The antibacterial properties of both the aqueous and ethanolic extracts were studied against Escherichia coli, Bacillus cereus, Pseudomonas aeruginosa, and Enterobacter aerogenes by the disc diffusion method, and the results reveal very good activity against these organisms. At concentrations of 1.0–1.2 mg/ml, the ethanolic extract showed considerable zones of inhibition against Escherichia coli, Bacillus cereus, Pseudomonas aeruginosa and Klebsiella pneumoniae. Minimum inhibitory concentration (MIC) and minimum bactericidal concentration (MBC) tests were conducted and showed a fairly good inhibitory effect on the organisms; therefore, the plant extract could be a potential source of antibacterial agents.

Keywords: antibacterial activity, Leptadenia hastata, infectious diseases, phytochemical screening

Procedia PDF Downloads 587
26156 Uncertainty Quantification of Corrosion Anomaly Length of Oil and Gas Steel Pipelines Based on Inline Inspection and Field Data

Authors: Tammeen Siraj, Wenxing Zhou, Terry Huang, Mohammad Al-Amin

Abstract:

The high-resolution inline inspection (ILI) tool is used extensively in the pipeline industry to identify, locate, and measure metal-loss corrosion anomalies on buried oil and gas steel pipelines. Corrosion anomalies may occur singly (i.e. individual anomalies) or as clusters (i.e. a colony of corrosion anomalies). Although ILI technology has advanced immensely, there are measurement errors associated with the sizes of corrosion anomalies reported by ILI tools due to limitations of the tools and their sizing algorithms, and the detection threshold of the tools (i.e. the minimum detectable feature dimension). Quantifying the measurement error in the ILI data is crucial for corrosion management and for developing maintenance strategies that satisfy the safety and economic constraints. Studies on the measurement error associated with the length of corrosion anomalies (in the longitudinal direction of the pipeline) have been scarcely reported in the literature, and this error is investigated in the present study. Limitations in the ILI tool and the clustering process can sometimes cause clustering error, which is defined as the error introduced during the clustering process by including or excluding a single anomaly or group of anomalies in or from a cluster. Clustering error has been found to be one of the biggest contributory factors to the relatively high uncertainties associated with ILI-reported anomaly length. As such, this study focuses on developing a consistent and comprehensive framework to quantify the measurement errors in the ILI-reported anomaly length by comparing the ILI data and the corresponding field measurements for individual and clustered corrosion anomalies. The analysis carried out in this study is based on the ILI and field measurement data for a set of anomalies collected from two segments of a buried natural gas pipeline currently in service in Alberta, Canada. The data analyses showed that the measurement error associated with the ILI-reported length of anomalies without clustering error, denoted as Type I anomalies, is markedly less than that for anomalies with clustering error, denoted as Type II anomalies. A methodology employing data mining techniques is further proposed to classify the Type I and Type II anomalies based on the ILI-reported corrosion anomaly information.
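
As a rough illustration of the workflow (not the study's actual procedure), the sketch below quantifies length measurement error as the difference between ILI-reported and field-measured lengths and fits a simple data-mining classifier to separate Type I from Type II anomalies. The feature names, data values, and the choice of a decision tree are all hypothetical assumptions.

```python
# Sketch: length error = ILI length - field length; a decision tree stands in for
# the unspecified data-mining classifier; all data below are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 200
ili_length = rng.uniform(10, 300, n)          # mm, ILI-reported anomaly length
n_in_cluster = rng.integers(1, 8, n)          # anomalies grouped into the cluster
depth_pct = rng.uniform(5, 60, n)             # ILI-reported depth, % wall thickness

# Synthetic ground truth: clustered anomalies tend to carry clustering error.
is_type2 = (n_in_cluster > 2) & (rng.random(n) < 0.8)
field_length = ili_length + rng.normal(0, 5, n) + np.where(is_type2, rng.normal(40, 20, n), 0)
length_error = ili_length - field_length      # measurement error to be quantified
print("mean |error| Type I :", np.abs(length_error[~is_type2]).mean())
print("mean |error| Type II:", np.abs(length_error[is_type2]).mean())

# Classify Type I vs Type II from ILI-reported information only.
X = np.column_stack([ili_length, n_in_cluster, depth_pct])
X_tr, X_te, y_tr, y_te = train_test_split(X, is_type2, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["Type I", "Type II"]))
```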

Keywords: clustered corrosion anomaly, corrosion anomaly assessment, corrosion anomaly length, individual corrosion anomaly, metal-loss corrosion, oil and gas steel pipeline

Procedia PDF Downloads 307
26155 On-Ice Force-Velocity Modeling Technical Considerations

Authors: Dan Geneau, Mary Claire Geneau, Seth Lenetsky, Ming-Chang Tsai, Marc Klimstra

Abstract:

Introduction: Horizontal force-velocity profiling (HFVP) involves modeling an athlete's linear sprint kinematics to estimate valuable maximum force and velocity metrics. This approach to performance modeling has been used in field-based team sports and has recently been introduced to ice hockey as a forward skating performance assessment. While preliminary data have been collected on ice, distance constraints of the on-ice test restrict the ability of the athletes to reach their maximal velocity, which limits the model's ability to effectively estimate athlete performance. This is especially true of more elite athletes. This report explores whether athletes on ice are able to reach a velocity plateau similar to what has been seen in overground trials. Fourteen male Major Junior ice-hockey players (BW = 83.87 ± 7.30 kg, height = 188 ± 3.4 cm, age = 18 ± 1.2 years, n = 14) were recruited. For on-ice sprints, participants completed a standardized warm-up consisting of skating and dynamic stretching and a progression of three skating efforts from 50% to 95%. Following the warm-up, participants completed three on-ice 45 m sprints, with three minutes of rest between each trial. For overground sprints, participants completed a dynamic warm-up similar to that of the on-ice trials. Following the warm-up, participants completed three 40 m overground sprint trials. For each trial (on-ice and overground), a radar gun (Stalker ATS II, Texas, USA) aimed at the participant's waist was used to collect instantaneous velocity. Sprint velocities were modelled using a custom Python (version 3.2) script using a mono-exponential function, similar to previous work. To determine whether on-ice trials achieved a maximum velocity (plateau), the minimum acceleration values of the modeled data at the end of the sprint were compared (using a paired t-test) between on-ice and overground trials. Significant differences (P < 0.001) between overground and on-ice minimum accelerations were observed. It was found that on-ice trials consistently reported higher final acceleration values, indicating that a maximum maintained velocity (plateau) had not been reached. Based on these preliminary findings, it is suggested that reliable HFVP metrics cannot yet be collected from all ice-hockey populations using current methods. Elite male populations were not able to achieve a velocity plateau similar to what has been seen in overground trials, indicating the absence of a maximum velocity measure. With current velocity and acceleration modeling techniques, which depend on a velocity plateau, these results indicate the potential for error in on-ice HFVP measures. Therefore, these findings suggest that a greater on-ice sprint distance may be required, or that other velocity modeling techniques, in which maximal velocity is not required for a complete profile, may be needed.
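
A minimal sketch of the modeling step is given below, assuming the commonly used mono-exponential form v(t) = v_max(1 - exp(-t/τ)); the radar samples are simulated, not the study's measurements. The modelled acceleration at the final time point serves as the plateau check described above.

```python
# Sketch: fit a mono-exponential sprint model to (synthetic) radar velocity data
# and inspect the modelled acceleration at the end of the trial.
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, v_max, tau):
    return v_max * (1.0 - np.exp(-t / tau))

# Simulated 45 m on-ice trial sampled at ~47 Hz (placeholder values).
t = np.linspace(0.0, 6.0, 280)
v_noisy = mono_exp(t, 9.2, 1.4) + np.random.default_rng(1).normal(0.0, 0.15, t.size)

(v_max, tau), _ = curve_fit(mono_exp, t, v_noisy, p0=(8.0, 1.0))

# Modelled acceleration a(t) = (v_max / tau) * exp(-t / tau); a small value at the
# final time step indicates the athlete approached a velocity plateau.
a_end = (v_max / tau) * np.exp(-t[-1] / tau)
print(f"v_max = {v_max:.2f} m/s, tau = {tau:.2f} s, final acceleration = {a_end:.3f} m/s^2")
```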

Keywords: ice-hockey, sprint, skating, power

Procedia PDF Downloads 98
26154 Rheological Behavior of Fresh Activated Sludge

Authors: Salam K. Al-Dawery

Abstract:

Despite a few research works on municipal sludge, there is still a lack of actual data. Thus, this work focused on the conditioning and rheology of fresh activated sludge. The effect of a cationic polyelectrolyte has been investigated at different concentrations and pH values in a comparative fashion. A yield stress is present in all results, indicating the minimum stress necessary to reach flow conditions. Particle-particle connections are the reason for this yield stress; in addition, the addition of polyelectrolyte causes strong bonds between particles and water, resulting in the aggregation of particles, which requires a higher shear stress in order to flow. The results from the experiments indicate that cationic polyelectrolytes have a significant influence on the sludge characteristics and on water quality indicators such as turbidity, SVI, zone settling rate and shear stress.

Keywords: rheology, polyelectrolyte, settling volume index, turbidity

Procedia PDF Downloads 355
26153 Optimizing the Performance of Thermoelectric for Cooling Computer Chips Using Different Types of Electrical Pulses

Authors: Saleh Alshehri

Abstract:

Thermoelectric technology is currently being used in many industrial applications for cooling, heating and generating electricity. This research mainly focuses on using thermoelectric modules to cool high-speed computer chips at different operating conditions. A previously developed and validated three-dimensional model for optimizing and assessing the performance of cascaded and non-cascaded thermoelectric modules is used in this study to investigate the possibility of decreasing the hotspot temperature of a computer chip. Additionally, a test assembly is built and tested at steady-state and transient conditions. The optimum thermoelectric current obtained at steady-state condition is used to conduct a number of pulsed tests (i.e. transient tests) with different shapes to cool the computer chip hotspots. The results of the steady-state tests showed that at a hotspot heat rate of 15.58 W (5.97 W/cm²), using a thermoelectric current of 4.5 A decreased the hotspot temperature from its open-circuit value (89.3 °C) by 50.1 °C. The maximum and minimum hotspot temperatures were affected by the ON and OFF durations of the electrical current pulse: the maximum hotspot temperature resulted from a longer OFF pulse period, while a longer ON pulse period generated the minimum hotspot temperature.

Keywords: thermoelectric generator, TEG, thermoelectric cooler, TEC, chip hotspots, electronic cooling

Procedia PDF Downloads 141
26152 Business Intelligence for Profiling of Telecommunication Customer

Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro

Abstract:

Business intelligence is a methodology that systematically exploits data to produce information and knowledge; it can support the decision-making process. Two methods used in business intelligence are data warehousing and data mining. A data warehouse can store historical data derived from transactional data. For data modelling in the data warehouse, we apply Kimball's dimensional modelling. Data mining is used to extract patterns from the data and gain insight from it. Data mining has many techniques, one of which is segmentation. For profiling telecommunication customers, we use customer segmentation according to the customers' usage of services, invoices and payments. Customers can be grouped according to their characteristics, and the profitable customers can be identified. We apply the K-Means clustering algorithm for segmentation, using the RFM (Recency, Frequency and Monetary) model as the input variables. For the whole data mining process, we use IBM SPSS Modeler.
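
An equivalent open-source sketch of the RFM-based K-Means segmentation is shown below (the study itself used IBM SPSS Modeler); the billing records and the choice of three clusters are placeholders.

```python
# Sketch: K-Means segmentation on standardized RFM features; synthetic customers.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# One row per customer: days since last payment, invoices in the period, total spend.
rfm = pd.DataFrame({
    "recency_days": [5, 40, 3, 90, 12, 60, 7, 25],
    "frequency":    [12, 3, 15, 1, 9, 2, 14, 5],
    "monetary":     [480.0, 60.0, 520.0, 20.0, 350.0, 45.0, 610.0, 150.0],
}, index=[f"cust_{i}" for i in range(8)])

# Standardize so no single RFM component dominates the distance metric.
X = StandardScaler().fit_transform(rfm)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
rfm["segment"] = kmeans.labels_

# Profitable segments show low recency, high frequency and high monetary value.
print(rfm.groupby("segment").mean().round(1))
```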

Keywords: business intelligence, customer segmentation, data warehouse, data mining

Procedia PDF Downloads 481
26151 Investigating the Factors Affecting the Household Accounting of People in Bangkeaw Samutsongkhram

Authors: Khajeerat Phumpurk

Abstract:

This research aims to study the knowledge of and attitude toward household accounting of people in Bangkeaw, Samutsongkhram. The sample used in this study was people in Tambol Bangkeao, Mueang Samut Songkhram province. The sample size was 100 households, selected by accidental sampling, and data were collected with a questionnaire. The statistical analysis covered frequency, percentage, mean, minimum and maximum values, and standard deviation. It was found that most of the respondents (63.4%) are farmers. Most of them are male, with an average age of 49.54 years, and their education level is vocational. The average household income is 60,873.74 per year. Of the respondents, 64.4 percent have debt with the bank. The greatest influence on farmers' household accounting is the bank's expertise: all the advice about keeping household accounts comes from the staff of the Bank for Agriculture and Agricultural Cooperatives. Of the farmers, 57.3% do their household accounting at night.

Keywords: Bangkeaw Samutsongkhram, household accounting, knowledge, sufficiency economy

Procedia PDF Downloads 224
26150 Effect of Soil Resistivity on the Development of a Cathodic Protection System Using Zinc Anode

Authors: Chinedu F. Anochie

Abstract:

The deterioration of materials as a result of their interaction with the environment has been a huge challenge to engineering. Many steps have been taken to tackle corrosion and its harmful effects on engineering materials and structures. Corrosion inhibition, coating, passivation, materials selection, and cathodic protection are some of the methods utilized to curtail the rate at which materials corrode. The use of sacrificial anodes (magnesium, aluminum, or zinc) to protect the metal of interest is a widespread technique used to prevent corrosion in underground structures, ship hulls, and other structures susceptible to corrosion attack. However, certain factors, like resistivity, affect the performance of sacrificial anodes. To establish the effect of soil resistivity on the effectiveness of a cathodic protection system, a mild steel specimen was cathodically protected around the Workshop 2 area, Federal University of Technology, Owerri, Nigeria. Design calculations showed that one zinc anode was sufficient to protect the pipe. The specimen (a mild steel pipe) was coated with white and black polykene tapes and was subsequently buried in a high-resistivity soil. The pipe-to-soil potential measurements were obtained using a digital Fluke multimeter. The protection potential obtained on installation was higher than the minimum protection criterion. However, the potential readings obtained at fourteen-day intervals continually decreased to a value significantly lower than the minimum protection criterion. This showed that the sacrificial anode (zinc) was rendered ineffective by the high resistivity of the area of installation. It has been shown that the resistivity of the soil has a marked effect on the feasibility of cathodic protection systems. This work established that a zinc anode cannot be used for cathodic protection around the Workshop 2 area, Federal University of Technology, Owerri, Nigeria, because of the high resistivity of the area. Experimental data explaining the effectiveness of a galvanic anode cathodic protection system for corrosion control of a small steel structure exposed to a soil of high resistivity have been established.

Keywords: cathodic protection, corrosion, pipe, sacrificial anode

Procedia PDF Downloads 183
26149 Fractal-Wavelet Based Techniques for Improving the Artificial Neural Network Models

Authors: Reza Bazargan lari, Mohammad H. Fattahi

Abstract:

Natural resources management, including water resources, requires reliable estimation of time-variant environmental parameters. Small improvements in the estimation of environmental parameters would have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for the pre-processing of practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based pre-processing. Time series correlation and persistency, the minimum length sufficient for training the predicting model, and the maximum valid length of predictions were also investigated through a fractal assessment.
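
Below is a minimal sketch of the wavelet de-noising pre-processing step using PyWavelets; the synthetic flow series, the 'db4' wavelet, and the universal soft threshold are illustrative assumptions rather than the authors' exact choices.

```python
# Sketch: multi-level wavelet decomposition, soft-threshold the detail
# coefficients, and reconstruct a de-noised river flow series.
import numpy as np
import pywt

rng = np.random.default_rng(42)
t = np.arange(1024)
flow = 50 + 20 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 5, t.size)  # noisy daily flow

# Multi-level discrete wavelet decomposition.
coeffs = pywt.wavedec(flow, "db4", level=4)

# Universal (VisuShrink-style) soft threshold estimated from the finest detail level.
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thresh = sigma * np.sqrt(2 * np.log(flow.size))
coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]

denoised = pywt.waverec(coeffs, "db4")[: flow.size]
print("noise std removed:", np.std(flow - denoised).round(2))
# 'denoised' would then feed the ANN training set in place of the raw series.
```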

Keywords: wavelet, de-noising, predictability, time series fractal analysis, valid length, ANN

Procedia PDF Downloads 368
26148 Distributed Cost-Based Scheduling in Cloud Computing Environment

Authors: Rupali, Anil Kumar Jaiswal

Abstract:

Cloud computing can be defined as one of the prominent technologies that lets a user change, configure and access services online. It can be said that this is a paradigm of computing that helps in saving a user's cost and time; practically, the use of cloud computing can be found in various fields like education, health, banking, etc. Cloud computing is an internet-dependent technology; thus it is the major responsibility of Cloud Service Providers (CSPs) to take care of the data stored by users at data centers. Scheduling in a cloud computing environment plays a vital role, as cloud providers need to schedule resources effectively to achieve maximum utilization and user satisfaction. Job scheduling for cloud computing is analyzed in the following work, and CloudSim 3.0.3 is utilized to recreate the task calculations and the distributed scheduling methods. This research work discusses job scheduling for a distributed processing environment and, by exploring this issue, finds that it works with minimum time and less cost. In this work, two load balancing techniques have been employed, 'Throttled stack adjustment policy' and 'Active VM load balancing policy', with two brokerage services, 'Advanced Response Time' and 'Reconfigure Dynamically', to evaluate the VM_Cost, DC_Cost, Response Time, and Data Processing Time. The proposed techniques are compared with the Round Robin scheduling policy.
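
As a toy illustration outside CloudSim, the sketch below captures the core idea behind a throttled load-balancing policy: each VM accepts a request only while its active-task count is below a threshold, otherwise the request waits. VM counts and thresholds are hypothetical, and this is not the paper's CloudSim configuration.

```python
# Sketch of throttled VM allocation: requests beyond each VM's threshold are queued.
from collections import deque

class ThrottledBalancer:
    def __init__(self, n_vms, threshold):
        self.active = [0] * n_vms          # tasks currently running on each VM
        self.threshold = threshold
        self.waiting = deque()             # requests with no available VM

    def allocate(self, request_id):
        for vm, count in enumerate(self.active):
            if count < self.threshold:
                self.active[vm] += 1
                return vm                  # index of the allocated VM
        self.waiting.append(request_id)    # throttled: queue until a VM frees up
        return None

    def release(self, vm):
        self.active[vm] -= 1
        if self.waiting:                   # hand the freed slot to a queued request
            self.allocate(self.waiting.popleft())

balancer = ThrottledBalancer(n_vms=3, threshold=2)
placements = {req: balancer.allocate(req) for req in range(8)}
print(placements)                          # requests 6 and 7 are queued (None)
```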

Keywords: physical machines, virtual machines, support for repetition, self-healing, highly scalable programming model

Procedia PDF Downloads 167
26147 Nonlinear Optics of Dirac Fermion Systems

Authors: Vipin Kumar, Girish S. Setlur

Abstract:

Graphene has been recognized as a promising 2D material with many new properties. However, pristine graphene is gapless, which hinders its direct application in graphene-based semiconducting devices. Graphene is a zero-gap, linearly dispersing semiconductor. Massless charge carriers (quasi-particles) in graphene obey the relativistic Dirac equation. These Dirac fermions show very unusual electronic, optical and transport properties. Graphene is analogous to two-level atomic systems and conventional semiconductors. We may therefore expect that graphene-based systems will also exhibit phenomena that are well known in two-level atomic systems and in conventional semiconductors. Rabi oscillation is a nonlinear optical phenomenon well known in the context of two-level atomic systems and also in conventional semiconductors. It is the periodic exchange of energy between the system of interest and the electromagnetic field. The present work describes the phenomenon of Rabi oscillations in graphene-based systems. Rabi oscillations have already been described theoretically and experimentally in the extensive literature available on this topic; to describe them, these studies use the rotating wave approximation (RWA), well known in studies of two-level systems. The RWA is valid only near conventional resonance (small detuning), when the frequency of the external field is nearly equal to the particle-hole excitation frequency. The Rabi frequency goes through a minimum close to conventional resonance as a function of detuning. Far from conventional resonance, the RWA becomes rather less useful, and we need some other technique to describe the phenomenon of Rabi oscillation. In conventional systems, there is no second minimum; the only minimum is at conventional resonance. But in graphene we find anomalous Rabi oscillations far from conventional resonance, where the Rabi frequency goes through a minimum that is much smaller than the conventional Rabi frequency. This is known as the anomalous Rabi frequency and is unique to graphene systems. We have shown that this is attributable to the pseudo-spin degree of freedom in graphene systems. A new technique, an alternative to the RWA called the asymptotic RWA (ARWA), has been invoked by our group to discuss the phenomenon of Rabi oscillation. The experimentally accessible current density shows different types of threshold behaviour in the frequency domain close to the anomalous Rabi frequency, depending on the system chosen. For single-layer graphene, the exponent at threshold is equal to 1/2, while for bilayer graphene it is computed to be equal to 1. Bilayer graphene shows harmonic (anomalous) resonances absent in single-layer graphene. The effect of asymmetry and trigonal warping (a weak direct inter-layer hopping in bilayer graphene) on these oscillations is also studied. Asymmetry has a remarkable effect only on the anomalous Rabi oscillations, whereas the Rabi frequency near conventional resonance is not significantly affected by the asymmetry parameter. In the presence of asymmetry, these graphene systems show Rabi-like oscillations (offset oscillations) even for vanishingly small applied field strengths (less than the gap parameter). The frequency of the offset oscillations may be identified with the asymmetry parameter.

Keywords: graphene, Bilayer graphene, Rabi oscillations, Dirac fermion systems

Procedia PDF Downloads 293
26146 Imputation Technique for Feature Selection in Microarray Data Set

Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam

Abstract:

Analysing DNA microarray data sets is a great challenge for bioinformaticians due to the complications of using statistical and machine learning techniques. The challenge is doubled if the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing values. One of the most important data analysis processes on a microarray data set is feature selection. This process finds the most important genes that affect a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
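
The sketch below illustrates the general idea of imputing missing expression values before ranking genes; KNN imputation and the ANOVA F-test filter are stand-ins for the paper's specific technique, and the expression matrix is randomly generated.

```python
# Sketch: impute missing microarray values, then rank genes with a univariate filter.
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(7)
n_samples, n_genes = 40, 200
X = rng.normal(size=(n_samples, n_genes))
y = rng.integers(0, 2, n_samples)                 # disease vs control labels
X[:, :5] += y[:, None] * 1.5                      # make the first 5 genes informative

# Introduce ~5% missing values, as is common in microarray data.
mask = rng.random(X.shape) < 0.05
X[mask] = np.nan

X_imputed = KNNImputer(n_neighbors=5).fit_transform(X)

selector = SelectKBest(score_func=f_classif, k=10).fit(X_imputed, y)
top_genes = np.argsort(selector.scores_)[::-1][:10]
print("top-ranked genes:", top_genes)
```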

Keywords: DNA microarray, feature selection, missing data, bioinformatics

Procedia PDF Downloads 572
26145 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework

Authors: Lutful Karim, Mohammed S. Al-kahtani

Abstract:

Sensors are being used in various applications such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data. Thus, aggregating and filtering sensor data is significantly important for designing an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. Thus, this paper introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation scheme (PDDA) for the lowest layer, at the sensors. Simulation results show that the PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay.
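
The sketch below is only an interpretation of what a priority-based, dynamic aggregation step at the sensor layer might look like, not the authors' PDDA algorithm: high-priority readings are forwarded immediately, while lower-priority readings are buffered and averaged over a window to suppress redundancy. The event classes and window size are hypothetical.

```python
# Interpretive sketch of priority-based aggregation at a sensor node.
from statistics import mean

HIGH_PRIORITY_TYPES = {"alarm", "gas_leak"}   # hypothetical critical event classes

def aggregate(readings, window=5):
    """readings: list of (sensor_type, value); returns packets to transmit."""
    packets, buffer = [], {}
    for sensor_type, value in readings:
        if sensor_type in HIGH_PRIORITY_TYPES:
            packets.append((sensor_type, value))             # no delay for critical data
        else:
            buffer.setdefault(sensor_type, []).append(value)
            if len(buffer[sensor_type]) == window:           # flush per type when full
                packets.append((sensor_type, mean(buffer.pop(sensor_type))))
    # Flush whatever is left at the end of the round.
    packets.extend((t, mean(v)) for t, v in buffer.items())
    return packets

sample = [("temp", 21.0), ("temp", 21.2), ("alarm", 1), ("temp", 21.1),
          ("temp", 21.3), ("temp", 21.2), ("humidity", 55.0)]
print(aggregate(sample))
```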

Keywords: big data, clustering, tree topology, data aggregation, sensor networks

Procedia PDF Downloads 343
26144 Recent Climate Variability and Crop Production in the Central Highlands of Ethiopia

Authors: Arragaw Alemayehu, Woldeamlak Bewket

Abstract:

The aim of this study was to understand the influence of current climate variability on crop production in the central highlands of Ethiopia. We used monthly rainfall and temperature data from 132 points, each representing a pixel of 10×10 km. The data are reconstructions based on station records and meteorological satellite observations. Production data for the five major crops in the area were collected from the Central Statistical Agency for the period 2004-2013 and for the main cropping season, locally known as Meher. The production data are at the Enumeration Area (EA) level and hence the best available dataset on crop production. The results show statistically significant decreasing trends in March–May (Belg) rainfall in the area. However, June–September (Kiremt) rainfall showed increasing trends in Efratana Gidim and Menz Gera Meder, the latter being statistically significant. Annual rainfall also showed positive trends in the area, except in Basona Werana, where significant negative trends were observed. On the other hand, maximum and minimum temperatures showed warming trends in the study area. Correlation results show that crop production and area of cultivation are positively correlated with rainfall and negatively correlated with temperature. When the trends in crop production are investigated, most crops showed negative trends, and below-average production was observed. Regression results show that rainfall was the most important determinant of crop production in the area. It is concluded that current climate variability has a significant influence on crop production in the area, and any unfavorable change in the local climate in the future will have serious implications for household-level food security. Efforts to adapt to the ongoing climate change should begin by tackling the current climate variability and should take a climate risk management approach.
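
The sketch below shows, on synthetic values rather than the study's data, how the reported statistics can be computed: a linear trend in seasonal rainfall, correlations between production and climate variables, and a regression of production on rainfall and maximum temperature.

```python
# Sketch of the trend, correlation, and regression steps; all numbers are placeholders.
import numpy as np
from scipy import stats

years = np.arange(2004, 2014)
rainfall = np.array([820, 790, 840, 760, 810, 730, 770, 700, 740, 690], float)  # mm
tmax = np.array([24.1, 24.3, 24.2, 24.6, 24.5, 24.8, 24.7, 25.0, 24.9, 25.2])   # °C
production = np.array([19.5, 18.9, 20.1, 17.8, 19.0, 16.9, 17.9, 16.0, 17.1, 15.8])  # qt/ha

# Trend in seasonal rainfall (slope and its significance).
trend = stats.linregress(years, rainfall)
print(f"rainfall trend: {trend.slope:.1f} mm/yr, p = {trend.pvalue:.3f}")

# Correlation of production with rainfall (positive) and temperature (negative).
print("r(prod, rain) =", round(stats.pearsonr(production, rainfall)[0], 2))
print("r(prod, tmax) =", round(stats.pearsonr(production, tmax)[0], 2))

# Multiple regression: production ~ rainfall + maximum temperature.
A = np.column_stack([np.ones_like(rainfall), rainfall, tmax])
coef, *_ = np.linalg.lstsq(A, production, rcond=None)
print("intercept, rainfall coef, tmax coef:", np.round(coef, 3))
```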

Keywords: central highlands, climate variability, crop production, Ethiopia, regression, trend

Procedia PDF Downloads 434
26143 Challenges of Implementing Participatory Irrigation Management for Food Security in Semi Arid Areas of Tanzania

Authors: Pilly Joseph Kagosi

Abstract:

The study aims at assessing the challenges observed during the implementation of the participatory irrigation management (PIM) approach for food security in semi-arid areas of Tanzania. Data were collected through a questionnaire, PRA tools, key informant discussions, Focus Group Discussions (FGD), participant observation, and a literature review. Data collected from the questionnaire were analysed using SPSS, while PRA data were analysed with the help of local communities during the PRA exercise. Data from other methods were analysed using content analysis. The study revealed that the PIM approach has contributed to improved food security at the household level due to the involvement of communities in water management activities and decision making, which enhanced the availability of water for irrigation and increased crop production. However, there were challenges observed during the implementation of the approach, including minimal participation of beneficiaries in decision-making during the planning and designing stages, meaning inadequate devolution of power among scheme owners; inadequate transparency, or lack thereof, on income and expenditure in Water Utilization Associations (WUAs); water conflicts among WUA members; conflicts between farmers and livestock keepers; and conflicts between WUA leaders and village governments regarding training opportunities and status. In addition, WUA rules and regulations are not legally recognized by the national court, and few farmers are involved in planting trees around water sources. However, it was realized that some of the mentioned challenges were rectified by the farmers themselves, facilitated by government officials. The study recommends that the identified challenges be rectified so that farmers can realize the importance of the PIM approach, as has been realized in other Asian countries.

Keywords: challenges, participatory approach, irrigation management, food security, semi arid areas

Procedia PDF Downloads 323
26142 Exploring the Effect of Cellulose Based Coating Incorporated with CaCl2 and MgSO4 on Shelf Life Extension of Kinnow (Citrus reticulata Blanco) Cultivar

Authors: Muhammad Atif Randhawa, Muhammad Nadeem

Abstract:

Kinnow (Citrus reticulata Blanco) is a nutritious and perishable fruit with high juice content and is also a rich source of vitamin C. In Pakistan, kinnow export is limited due to inadequate post-harvest handling and a lack of satisfactory storage practices. Considering these issues, the present study was designed to evaluate the effect of a hydroxypropyl methylcellulose (HPMC) coating in combination with CaCl2 and MgSO4 on the shelf life extension of kinnow. Fruits were treated with different levels of CaCl2 and MgSO4 followed by HPMC coating (3 and 5%) and stored at 10 °C with 80% relative humidity for 6 weeks. Fruits were analyzed for various physico-chemical parameters on a weekly basis. During this study, lower fruit firmness (0.24 Nm⁻²), weight loss (0.64%) and ethylene production (0.039 µL·kg⁻¹·hr⁻¹) were observed in fruits treated with 1% CaCl2 + 1% MgSO4 + 5% HPMC (T6) during storage of 42 days. Minimum chilling injury indices of 0.22% and 0.61% were recorded in treatments T4 and T6, respectively. T6 showed higher values of titratable acidity (0.29%) and ascorbic acid content (39.82 mg/100 g). The minimum TSS (9.62 °Brix) was found in fruits of T6. Overall, T6 showed significantly better results for the various parameters compared to all other treated and control fruits.

Keywords: firmness, kinnow coating, physicochemical, storage

Procedia PDF Downloads 428
26141 Evaluation of Surface Roughness Condition Using App Roadroid

Authors: Diego de Almeida Pereira

Abstract:

The roughness index of a road is considered the most important parameter describing the quality of the pavement, as it is closely related to the comfort and safety of road users. This condition can be established by means of a functional evaluation of pavement surface deviations, measured by the International Roughness Index (IRI), an index that came out of the international evaluation of pavements coordinated by the World Bank and whose limit value for the acceptance of roads in Brazil is currently 2.7 m/km. This work makes use of the e.IRI parameter, obtained by the Roadroid app for smartphones running the Android operating system. The choice of this application is due to its practical user interaction, its own cloud data storage, and the support given to universities all around the world. Data were collected once a month for six months. The study began in March 2018, a season of rainfall that worsens the condition of the roads, which also provided the opportunity to follow the damage and the quality of the interventions performed. About 350 kilometers of sections of four federal highways were analyzed, BR-020, BR-040, BR-060 and BR-070, which connect the Federal District (the area where Brasilia is located) and its surroundings; they were chosen for their economic and tourist importance, two of them being under federal administration and the other two under private concession. Like much of the road network, the analyzed stretches are surfaced with Hot Mix Asphalt (HMA). Thus, the present research contrasts the comfort and safety conditions of the roads under private concession, on which users pay a fee to the concessionaires in order to travel on a road that meets the minimum requirements for use, with the quality of service offered on the roads under Federal Government jurisdiction. Finally, data collected by the National Department of Transport Infrastructure (DNIT) by means of a laser profilometer are contrasted with data obtained by Roadroid, checking the applicability, practicality and cost-effectiveness of the app, considering its limitations.

Keywords: roadroid, international roughness index, Brazilian roads, pavement

Procedia PDF Downloads 83
26140 A Two-Pronged Truncated Deferred Sampling Plan for Log-Logistic Distribution

Authors: Braimah Joseph Odunayo, Jiju Gillariose

Abstract:

This paper is aimed at developing a sampling plan that uses information from preceding and succeeding lots for lot disposition, under the assumption that the lifetime of a particular product follows a log-logistic distribution. A Two-pronged Truncated Deferred Sampling Plan (TTDSP) for the log-logistic distribution is proposed for the case when testing is truncated at a precise time. The best possible sample sizes are obtained under a given Maximum Allowable Percent Defective (MAPD), Test Suspension Ratio (TSR), and acceptance number (c). A formula for calculating the operating characteristics of the proposed plan is also developed. The operating characteristics and mean-ratio values were used to measure the performance of the plan. The findings of the study show that the log-logistic distribution has a decreasing failure rate; furthermore, as the mean-life ratio increases, the failure rate decreases, and the sample size increases as the acceptance number, test suspension ratio and maximum allowable percent defective increase. The study concludes that the minimum sample sizes were smaller, which makes the plan more economical to adopt when the cost and time of production are high and the experiment is destructive.
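
As a simplified stand-in for the plan's operating characteristic formula (which the abstract does not reproduce), the sketch below computes the acceptance probability of a single-stage, time-truncated life test when lifetimes follow a log-logistic distribution; the plan parameters (n, c, shape, truncation ratio) are illustrative and the two-pronged deferred logic using neighbouring lots is not modelled here.

```python
# Sketch: acceptance probability of a truncated life test under a log-logistic lifetime.
from scipy.stats import binom

def loglogistic_cdf(t, scale, shape):
    """F(t) = 1 / (1 + (t/scale)^(-shape)), t > 0."""
    return 1.0 / (1.0 + (t / scale) ** (-shape))

def prob_acceptance(n, c, t0, scale, shape):
    """Accept the lot if at most c of n items fail before the truncation time t0."""
    p_fail = loglogistic_cdf(t0, scale, shape)
    return binom.cdf(c, n, p_fail)

# OC values versus the mean-life ratio (true scale / specified scale), with
# illustrative plan parameters n = 20, c = 2, shape = 2, truncation ratio 0.5.
n, c, shape, t0 = 20, 2, 2.0, 0.5
for ratio in (1, 2, 4, 6, 8):
    print(f"mean-life ratio {ratio}: P(accept) = {prob_acceptance(n, c, t0, ratio, shape):.3f}")
```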

Keywords: consumers risk, mean life, minimum sample size, operating characteristics, producers risk

Procedia PDF Downloads 140
26139 Phytochemical and Antimicrobial Studies of Root Bark Extracts from Glossonema boveanum (Decne.)

Authors: Ahmed Jibrin Uttu, Maimuna Waziri

Abstract:

The root bark of Glossonema boveanum (Decne.), a member of the Apocynaceae family, is used by traditional medicine practitioners to treat urinary and respiratory tract infections, bacteremia, typhoid fever, bacillary dysentery, diarrhea and stomach pain. The present study aims to validate the medicinal claims ascribed to the root bark of the plant. A preliminary phytochemical study of the root bark extracts (n-hexane, ethyl acetate, chloroform and methanol extracts) showed the presence of alkaloids, carbohydrates, steroids, triterpenes, cardiac glycosides, saponins, tannins and flavonoids. The antimicrobial study of the extracts showed activity against Staphylococcus aureus, Bacillus subtilis, Salmonella typhi, Shigella dysenteriae, Escherichia coli, Enterobacter cloacae, Streptococcus agalactiae and Candida albicans, while Micrococcus luteus, Pseudomonas aeruginosa and Klebsiella pneumoniae showed resistance to all the extracts. The inhibitory effect was compared with the standard drugs ciprofloxacin and fluconazole. The MIC and MBC of the extracts were also determined using the tube dilution method. This study concludes that the root bark of G. boveanum, used traditionally as a medicinal plant, has antimicrobial activity against some causative organisms.

Keywords: Glossonema boveanum (Decne.), phytochemical, antimicrobial, minimum inhibitory concentration, minimum bactericidal concentration

Procedia PDF Downloads 266
26138 The Thinking of Dynamic Formulation of Rock Aging Agent Driven by Data

Authors: Longlong Zhang, Xiaohua Zhu, Ping Zhao, Yu Wang

Abstract:

The construction of mines, railways, highways, water conservancy projects, etc., has formed a large number of high, steep slope wounds in China. Under the premise of slope stability and safety, repairing these wound spaces at minimum cost, in a green manner and close to the natural state has become a new problem. Nowadays, in-situ element testing and analysis, monitoring, and field quantitative factor classification and assignment evaluation produce vast amounts of data. Data processing and analysis will inevitably differentiate the morphology, mineral composition and physicochemical properties of rock wounds, by which the appropriate techniques and materials for restoration can be dynamically matched. In the present research, based on a grid partition of the slope surface, the contents of the combined oxides of the rock minerals (SiO₂, CaO, MgO, Al₂O₃, Fe₃O₄, etc.) were tested, and the hardness and breakage of the rock texture were classified and assigned values. The data on these essential factors were interpolated and normalized in GIS, which formed a differential zoning map of the slope space. According to the physical and chemical properties and spatial morphology of the rocks in the different zones, organic acids (plant waste fruit, fruit residue, etc.), natural mineral powders (zeolite, apatite, kaolin, etc.), a water-retaining agent, and plant gum (melon powder) were mixed in different proportions to form rock aging agents. Spraying aging agents with different formulas on the slopes in different sections can effectively age the fresh rock wound, providing convenience for seed implantation and reducing the transformation of heavy metals in the rocks. Through many practical engineering applications, a dynamic data platform for the rock aging agent formula system has been formed, which provides materials for the restoration of different slopes. It will also provide a guideline for the mixed use of various natural materials to solve the complex, non-uniform ecological restoration problem.

Keywords: data-driven, dynamic state, high steep slope, rock aging agent, wounds

Procedia PDF Downloads 112
26137 The Importance of Zakat in Struggle against Circle of Poverty and Income Redistribution

Authors: Hasan Bulent Kantarci

Abstract:

This paper examines how zakat provides fair income redistribution and helps the struggle against poverty. Providing fair income redistribution and fighting poverty are among the fundamental tasks of all countries. Each country seeks a solution to this problem according to its political, economic and administrative style, applying various economic and financial policies. In Islam, the same situation is handled through the institution of zakat. Nowadays, we observe different versions of zakat in developed countries: applications such as the negative income tax are essentially the zakat applied in almost the same way under a different name. However, the minimum thresholds for paying zakat (e.g. 85 g of gold, 40 animals) have been altered, and various amounts are put into practice. It might be named negative income tax instead of zakat; nonetheless, these applications are based on the Holy Koran and the hadith delivered 1400 years ago. Besides, considering the savagery and slavery in the world at that time, we can easily recognize the true value of the zakat as first applied in the Islamic system. Through zakat, an income transfer is enabled by the government so that the poor can reach a minimum standard of living. To whom the zakat should be given was not left to people's hearts but was encouraged to be determined according to objective criteria. Since zakat is obligatory, the transfer is not handed over directly but is collected and distributed via the government, which requires a vast government organization. Applying the zakat as it should be applied would largely reduce poverty and ensure fair income redistribution.

Keywords: Islamic finance, zakat, income redistribution, circle of poverty, negative income tax

Procedia PDF Downloads 345
26136 Designing and Implementation of MPLS Based VPN

Authors: Muhammad Kamran Asif

Abstract:

MPLS stands for Multi-Protocol Label Switching. It is the technology that replaces ATM (Asynchronous Transfer Mode) and frame relay. In this paper, we have designed a full-fledged, small-scale MPLS-based service provider core network model, which provides communication services (e.g. voice, video and data) to customers more efficiently using the label switching technique. Using MPLS VPN provides security to customers on either a LAN or a WAN. It protects individual customer sites from being attacked by any intruder from the outside world, along with providing the concept of extending a private network over the internet. In this paper, we implemented a service provider network using the minimum available resources, i.e. five Cisco 3800 series routers comprising the service provider core, provider edge routers and customer edge routers. The customers at one end of the network (customer side) are capable of sending any kind of data to the customers at the other end using the service provider cloud, which is MPLS VPN enabled. We have also carried out simulation and emulation of the model using GNS3 (Graphical Network Simulator-3) and achieved real-time scenarios. We have also deployed an NMS, which monitors our service provider cloud and generates alarms in case of any intrusion or malfunctioning in the network. Moreover, we have also provided a video help desk facility between customers and the service provider cloud to resolve network issues more effectively.

Keywords: MPLS, VPN, NMS, ATM, asynchronous transfer mode

Procedia PDF Downloads 330
26135 Analysis and Identification of Different Factors Affecting Students’ Performance Using a Correlation-Based Network Approach

Authors: Jeff Chak-Fu Wong, Tony Chun Yin Yip

Abstract:

The transition from secondary school to university seems exciting to many first-year students but can be more challenging than expected. Enabling instructors to know students' learning habits and styles enhances their understanding of the students' learning backgrounds, allows teachers to provide better support for their students, and therefore has high potential to improve teaching quality and learning, especially in any mathematics-related course. The aim of this research is to collect students' data using online surveys, to analyze student factors using learning analytics and educational data mining, and to discover the characteristics of students at risk of falling behind in their studies based on their previous academic backgrounds and the collected data. In this paper, we use correlation-based distance methods and mutual information for measuring the relationships between student factors. We then develop a factor network using the Minimum Spanning Tree method and consider further study of the topological properties of these networks using social network analysis tools. Under the framework of mutual information, two graph-based feature filtering methods, i.e., unsupervised and supervised infinite feature selection algorithms, are used to analyze the students' data, to rank and select appropriate subsets of features, and to yield effective results in identifying the factors affecting students at risk of failing. This discovered knowledge may help students as well as instructors enhance educational quality by finding possible under-performers at the beginning of the first semester and paying more special attention to them in order to help their learning process and improve their learning outcomes.
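
A minimal sketch of the correlation-based factor network is given below: pairwise correlations between student factors are converted into distances d = sqrt(2(1 - r)) and the minimum spanning tree is extracted with NetworkX. The factor names and survey responses are placeholders, not the study's data.

```python
# Sketch: correlation-based distances between student factors and their MST backbone.
import numpy as np
import networkx as nx

factors = ["study_hours", "attendance", "math_background", "sleep", "part_time_work"]
rng = np.random.default_rng(3)
responses = rng.normal(size=(120, len(factors)))           # 120 students (synthetic)
responses[:, 1] += 0.8 * responses[:, 0]                   # correlated pair for illustration

corr = np.corrcoef(responses, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - corr))                          # correlation-based distance

G = nx.Graph()
for i in range(len(factors)):
    for j in range(i + 1, len(factors)):
        G.add_edge(factors[i], factors[j], weight=dist[i, j])

mst = nx.minimum_spanning_tree(G)                           # factor network backbone
for u, v, d in mst.edges(data=True):
    print(f"{u} -- {v}: distance {d['weight']:.2f}")
```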

Keywords: students' academic performance, correlation-based distance method, social network analysis, feature selection, graph-based feature filtering method

Procedia PDF Downloads 129
26134 Efficacy of Hemi-Facetectomy in Treatment of Lumbar Foraminal Stenosis

Authors: Manoj Deepak, N. Mathivanan, K. Venkatachalam

Abstract:

Nerve root stenosis is one of the main causes of back pain. There are many methods, both conservative and surgical, to treat this disease. It is pertinent to decompress the spine to a proper extent so as to avoid recurrence of symptoms, but too aggressive an approach also has its disadvantages. We present one method to effectively decompress the nerve with better results. Our study was carried out in 52 patients with foraminal stenosis between 2008 and 2011. We carried out the surgical procedure of shaving off the medial part of the facet joint so as to decompress the root. We selected patients who had symptoms of claudication for more than 2 years; they had no signs of instability and had undergone conservative treatment for a period of 2 months before the procedure. Oswestry scoring was used to record the functional level of the patients before and after the procedure. All patients were followed up for a minimum period of 2.5 years. After evaluation over a minimum of 2.5 years, 34 patients had no evidence of recurrence of symptoms, with improvement in their functional level. Seven patients complained of minimal pain, but their functional quality had improved postoperatively. Six patients had symptoms of lumbar canal disease, which reduced with conservative treatment. Five patients required spinal fusion surgery in the later period. Conclusion: We can effectively conclude that our procedure is safe and effective in reducing symptoms in patients with neurogenic claudication.

Keywords: facetectomy, stenosis, decompression, lumbar foraminal stenosis, hemi-facetectomy

Procedia PDF Downloads 350
26133 Performance Evaluation of IAR Multi Crop Thresher

Authors: Idris Idris Sunusi, U.S. Muhammed, N.A. Sale, I.B. Dalha, N.A. Adam

Abstract:

Threshing efficiency and mechanical grain damage are among the important parameters used in rating the performance of agricultural threshers. To be acceptable to farmers, threshers should have high threshing efficiency and low grain damage. The objective of the research was to evaluate the performance of the thresher using sorghum and millet; the performance parameters considered were threshing efficiency and mechanical grain damage. For millet, four drum speed levels (700, 800, 900 and 1000 rpm) were considered, while for sorghum, 600, 700, 800 and 900 rpm were considered. The feed rate levels were 3, 4, 5 and 6 kg/min for both sorghum and millet; the moisture content levels were 8.93 and 10.38% for sorghum and 9.21 and 10.81% for millet. For millet, the test results showed a maximum threshing efficiency of 98.37% and a minimum mechanical grain damage of 0.24%, while for sorghum the test results indicated a maximum threshing efficiency of 99.38% and a minimum mechanical grain damage of 0.75%. In comparison to the previous thresher, the threshing efficiency and mechanical grain damage of the modified machine have improved by 2.01% and 330.56%, respectively, for millet and by 5.31% and 287.64% for sorghum. Analysis of variance (ANOVA) also showed that the effects of drum speed, feed rate and moisture content on the performance parameters were significant.

Keywords: threshing efficiency, mechanical grain damage, sorghum and millet, multi crop thresher

Procedia PDF Downloads 348
26132 Rice Area Determination Using Landsat-Based Indices and Land Surface Temperature Values

Authors: Burçin Saltık, Levent Genç

Abstract:

In this study, the aim was to determine a procedure for the identification of rice cultivation areas within the Thrace and Marmara regions of Turkey using remote sensing and GIS. Landsat 8 (OLI-TIRS) images acquired in the 2013 production season, with Path/Row number 181/32, were used. Four different seasonal images were generated utilizing the original bands and different transformation techniques. All images were classified individually using supervised classification techniques, and Land Use Land Cover (LULC) maps were generated with 8 classes. The areas (ha, %) of each class were calculated. In addition, district-based rice distribution maps were developed, and the results of these maps were compared with the Turkish Statistical Institute's (TurkSTAT; TSI) actual rice cultivation area records. Accuracy assessments were conducted, and the most accurate map was selected depending on the accuracy assessment and coherency with the TSI results. Additionally, rice areas on slopes over 4° were considered mis-classified pixels and were eliminated using a slope map and GIS tools. Finally, randomized rice zones were selected to obtain the maximum-minimum value ranges of the NDVI, LSWI, and LST images for each date (May, June, July, August and September images separately), to test whether they may be used for rice area determination via the raster calculator tool of ArcGIS. The most accurate classification for rice determination was obtained from the seasonal LSWI LULC map; considering the TSI data and accuracy assessment results, mis-classified pixels were eliminated from this map. According to the results, 83,151.5 ha of rice area exists within the study area. However, this result is higher than the TSI records, with a difference of 12,702.3 ha. The use of the maximum-minimum ranges of rice area NDVI, LSWI, and LST was tested in Meric district. It was seen that using the value ranges obtained from the July imagery gave the closest results to the TSI records, with a difference of only 206.4 ha. This difference is normal given the relatively low resolution of the images. Thus, the employment of images with higher spectral, spatial, temporal and radiometric resolutions may provide more reliable results.
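
The sketch below illustrates the index-threshold step: NDVI and LSWI are computed from Landsat 8 reflectance bands (band 4 = red, 5 = NIR, 6 = SWIR1) and pixels are flagged when their July values fall inside minimum-maximum ranges sampled from known rice zones. The arrays and value ranges are illustrative placeholders, not the study's numbers.

```python
# Sketch: compute NDVI/LSWI and apply min-max range masks to estimate rice area.
import numpy as np

rng = np.random.default_rng(5)
red, nir, swir1 = (rng.uniform(0.02, 0.5, (100, 100)) for _ in range(3))  # reflectance

ndvi = (nir - red) / (nir + red)
lswi = (nir - swir1) / (nir + swir1)

# Min-max ranges observed over randomized rice zones in the July image (placeholders).
ndvi_range = (0.45, 0.85)
lswi_range = (0.10, 0.45)

rice_mask = (
    (ndvi >= ndvi_range[0]) & (ndvi <= ndvi_range[1])
    & (lswi >= lswi_range[0]) & (lswi <= lswi_range[1])
)

pixel_area_ha = 0.09   # 30 m x 30 m Landsat pixel
print(f"estimated rice area: {rice_mask.sum() * pixel_area_ha:.1f} ha")
```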

Keywords: landsat 8 (OLI-TIRS), LST, LSWI, LULC, NDVI, rice

Procedia PDF Downloads 227
26131 Estimation of Pressure Loss Coefficients in Combining Flows Using Artificial Neural Networks

Authors: Shahzad Yousaf, Imran Shafi

Abstract:

This paper presents a new method for the calculation of pressure loss coefficients in tee junctions by use of an artificial neural network (ANN). Geometry and flow parameters are fed into the ANN as inputs for the purpose of training the network. The efficacy of the network is demonstrated by comparison of the experimental and ANN-calculated pressure loss coefficients for combining flows in a tee junction. Reynolds numbers ranging from 200 to 14000 and discharge ratios varying from minimum to maximum flow have been used for the calculation of pressure loss coefficients. Pressure loss coefficients calculated using the ANN are compared to models from the literature used for junction flows. The results achieved after the application of the ANN agree reasonably with the experimental values.
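
The sketch below shows a small neural network regressor mapping (Reynolds number, discharge ratio) to a loss coefficient; the training data are synthetic stand-ins for the experimental measurements, and the network size is an assumption.

```python
# Sketch: MLP regression of a combining-tee pressure loss coefficient (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(11)
n = 300
reynolds = rng.uniform(200, 14000, n)
q_ratio = rng.uniform(0.0, 1.0, n)                     # branch-to-combined discharge ratio
# Synthetic loss coefficient with a plausible shape plus noise.
k_loss = 0.6 + 1.5 * q_ratio**2 - 0.3 * np.log10(reynolds) / 4 + rng.normal(0, 0.02, n)

X = np.column_stack([reynolds, q_ratio])
model = make_pipeline(
    StandardScaler(),                                  # scale inputs for stable training
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0),
)
model.fit(X, k_loss)

test = np.array([[5000.0, 0.5]])
print("predicted loss coefficient:", model.predict(test).round(3))
```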

Keywords: artificial neural networks, combining flow, pressure loss coefficients, solar collector tee junctions

Procedia PDF Downloads 388
26130 Anti-Methicillin-Resistant Staphylococcus aureus (MRSA) Compounds from Bauhinia kockiana Korth and Their Mechanism of Antibacterial Activity

Authors: Yik Ling Chew, Adlina Maisarah Mahadi, Joo Kheng Goh

Abstract:

Bauhinia kockiana originates from Peninsular Malaysia, where it is grown as a garden ornamental plant. However, it is used as a medicinal plant by the Malaysian 'Kelabit' ethnic group in treating various diseases and illnesses. This study focused on the assessment of the antibacterial activity of B. kockiana towards MRSA, the purification and identification of the antibacterial compounds, and the determination of the mechanism of antibacterial activity. The antibacterial activity of B. kockiana flower is evaluated qualitatively and quantitatively using the disc diffusion assay and the microbroth dilution method to determine the minimum inhibitory concentration (MIC) and minimum bactericidal concentration (MBC) of the extracts. Phytochemical analysis is performed to determine the classes of phytochemicals in the extracts. Bioactivity-guided isolation is performed to purify the antibacterial agents, and the chemical structures are identified via various spectroscopic methods. The scanning electron microscopy (SEM) technique is adopted to evaluate the antibacterial mechanism of the extract and the isolated compounds. B. kockiana flower is found to exhibit fairly strong antibacterial activity towards both strains of MRSA bacteria. Gallic acid and its ester derivatives are purified from the ethyl acetate extract, and their antibacterial activity is evaluated. SEM reveals the antibacterial mechanism of the extracts and the isolated compounds.

Keywords: alkyl gallates, Bauhinia kockiana, MRSA, scanning electron microscopy

Procedia PDF Downloads 365